
The effect of teaching technology on the performance and attitudes of accounting principles students.

ABSTRACT

Improvements in computer technology and audio-video equipment have allowed accounting faculty to significantly change the way accounting information is delivered to students. Presentation software, such as PowerPoint, allows faculty to build well-designed slides that would be impossible with the traditional chalk, blackboard and overhead method. The research question examined in this paper is whether PowerPoint presentations significantly improve student performance and attitudes in the principles of accounting course. The results show no statistically significant differences in either student performance or attitudes between sections taught using PowerPoint and those taught using the traditional method.

INTRODUCTION

Computer technology allows an accounting faculty member to change the delivery method of the accounting information provided to students. Recently developed teaching technology enables faculty to include graphics, scanned images, animation, sound, and access to Internet web sites. Presentation software, such as PowerPoint, allows faculty members to build slides in a well-designed format that would be virtually impossible to duplicate using the chalk and blackboard method. However, this new teaching methodology does not come without considerable cost. There are enormous costs for hardware, software, and faculty training and preparation time. Considering the substantial direct and indirect costs of "teaching" technology, a very important question is "Do teaching technologies actually help students learn?" The major objectives of this research project were to determine whether a particular teaching method results in measurable performance effects among accounting principles students and whether it affects students' attitudes toward the accounting profession and toward their instructor.

PREVIOUS RESEARCH

Recently the pressure on university administrators and faculty to increase the level of technology in the classroom has intensified. In what has been referred to as an academic technological arms race (Jackson, 2000), universities are routinely expected to compete for rankings such as Yahoo! Internet Life's "most wired colleges." This pressure has become a centerpiece of debate on college campuses, as can be seen in virtually any edition of the Chronicle of Higher Education in the past five years (see, for instance, http://chronicle.com/infotech/wiredcampus.xml for historical references to these articles).

Previous research, primarily in the field of education, has demonstrated the potential of multimedia to enhance the learning process. Shank (1995) argued that multimedia-based instruction theoretically should promote student learning and retention. Many others (Pea and Gomez, 1992; Drook, 1994; Roberts, 1994; Liebowitz and Letsky, 1996; Jategaonkar and Babu, 1995) have developed successful applications of multimedia technology in the learning process.

Liu and Beamer (1997) provided anecdotal evidence suggesting positive pedagogical effects of multimedia in the classroom, due to its ability to "attract, hold the attention, and spark the imagination of" the viewer. However, Becker and Watts (1998) suggested that most professors have not kept pace with this technology sufficiently to exploit these advantages. From evidence in the economics field, Greenlaw (1999) found that the only definitive outcome was an increase in the initial workload of faculty for new course preparations.

Over a decade ago, the Accounting Education Change Commission (1992, 1993) provided guidance that recommended, among other things, the use of materials in the first accounting course that "enhance presentation ... consistent with current developments and new technology in the field...." Specific to the field of accounting education, Jensen and Sandlin (1992a and 1992b) strongly supported the use of technology in accounting education and provided information for those faculty interested in using multimedia approaches in the classroom.

Research into the effect of multimedia in the accounting classroom is very limited. Landry et al. (1997) reported that ten percent of faculty at teaching universities use multimedia, while only four percent of those at research institutions do; the most common type of multimedia used was video. Rand et al. (1997) found that a multimedia presentation method had a significant effect on students' test scores immediately following a classroom presentation, but this positive effect disappeared after two weeks. The multimedia presentation did not affect the students' evaluation of the topics or of the faculty members.

The Chronicle of Higher Education (Young, 2004) recently summarized the most comprehensive study to date regarding technology and teaching. In a survey conducted by the Educause Center for Applied Research, 4,374 students at 13 colleges of all types provided their perceptions of technology in the classroom. According to the report, 48.5 percent of the respondents said the biggest benefit of classroom technology is convenience, while only 12.7 percent of the students said improved learning was the greatest benefit. Researchers who conducted the study specifically asked students to comment on professors' use of PowerPoint slide shows. Generally, the respondents were negative, complaining that "faculty tend to read PowerPoint slides rather than teaching from them," leading the interviewer to conclude that PowerPoint used badly makes a lecture worse.

The only two business-specific studies that isolate the effects of computer-generated slide presentations (PowerPoint) on student performance and interest were conducted by Hagen et al. (1997) in strategic management and by Rankin and Hoaas (2001) in economics. Hagen et al. conducted an experiment with management students to determine if the inclusion of computer-aided presentations affected students' satisfaction, participation, and performance, and found that all three variables were positively affected by the presentations. In a more specific experiment, Rankin and Hoaas (2001) observed 69 Principles of Economics students across two semesters and four class sections, half of which were exposed to PowerPoint presentations and half of which were exposed only to the traditional lecture method. Their results showed no statistical effect on student performance, student attitudes toward the subject matter, or student evaluations of the instructor. Since the results of these two studies are contradictory and neither involved accounting courses, a well-controlled study focusing on accounting education would be beneficial.

HYPOTHESES

Drawing from this inconclusive literature, this experiment is designed to test hypotheses relating course presentation method to the performance and attitudes of accounting principles students. Thus, the model tested is:

Student Performance, Student Attitude = f(Course Presentation Method) + e     (1)

Stated in the null, these hypotheses are:

H1: Student performance in an accounting principles course will not be affected by the instructor's presentation method of course material.

H2: Student attitude in an accounting principles course will not be affected by the instructor's presentation method of course material.

Where: student performance is measured by individual examination results and grade distributions; student attitude is measured by course evaluations and additional survey results; and instructional presentation method is either a primarily "chalk and talk" delivery or a Microsoft PowerPoint-based delivery.


Since PowerPoint controls 93% of the presentation software market (Rankin & Hoaas, 2001), this study utilized it as the vehicle of study within the context of performance and attitudes in the first accounting course.

METHODOLOGY

An experiment was conducted in four accounting principles sections across two consecutive semesters. In this study, two types of teaching methods were compared: (1) traditional lecture using a blackboard for illustrations and transparencies on an overhead projector for problems and exercises, and (2) multimedia lecture using PowerPoint for lectures, problems and exercises. Prior to each of these lectures, copies of all slides were made available to the students in the PowerPoint sections only.

The experiment was conducted at a comprehensive regional (13,000 students) public university in the Southeastern United States. The sophomore level course, entitled "Principles of Financial Accounting," is the first accounting course offered, and is required of all Business majors as well as students majoring in Dietetics, Construction Management and Information Systems. The multimedia materials used in the course were developed during the summer term preceding the experiment, and the courses were taught during the following spring and fall academic terms. The PowerPoint presentation package accompanying the course text was modified and customized for the course.

The classroom in which the four sections were held was equipped with chalkboards, a traditional overhead projector, and a "technology podium" at the front of the room. The podium contained a computer, a VCR, a DVD player, and a document camera. Multimedia presentations could be projected onto a wide screen that could be lowered over the chalkboard.

During the first experimental academic term, the two course sections were taught in back-to-back time blocks in the same classroom. The first section was taught using the traditional blackboard approach, and the second section was taught using PowerPoint. During the second experimental term, order effects were controlled by again teaching two sections consecutively, this time with the PowerPoint section taught first and the traditional section second. Each of the paired sections covered identical material with the same instructor, the same textbook, and identical examinations. In addition, the classrooms for the paired sections were the same, the syllabi were identical, the quizzes were identical, and the grading scale was the same for both. The controls used in this study are shown in Table 1.

There were initially 154 students enrolled in the four test sections, of which 99 received final course grades, resulting in an attrition rate of 35.7%. This attrition rate is typical of the course, and did not significantly differ between the sections studied or from the departmental norm during, before, and after the study period.
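(That is, 55 of the 154 initially enrolled students did not complete the course for a grade: 55 / 154 ≈ 35.7%.)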

The experiment was conducted during semesters when 20 Principles sections (11 in the fall and 9 in the spring) were taught by nine separate instructors. Aside from a common textbook, instructors were completely independent with respect to teaching and testing styles. The students in the experimental sections were not informed that the professor was utilizing a different teaching approach in the match-paired control sections.

HYPOTHESIS TESTING

Student Performance

Four examinations were given at equally spaced intervals throughout each test semester. The three exams given during the term had a maximum score of 200 points each, and the comprehensive final exam had a maximum score of 250 points. Each exam consisted of approximately 50% multiple-choice questions and 50% problem-based exercises. These four examinations accounted for 85% of the final course grade.
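Under this weighting, the examinations carried (3 x 200) + 250 = 850 points; since those 850 points constituted 85% of the final course grade, the course was effectively graded on a 1,000-point scale (an inference from the stated weights rather than a figure reported explicitly).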

Hypothesis one was first tested by comparing the mean scores for each exam between the match-paired course sections. T-tests were performed on each of the four examinations, comparing the traditional section with the PowerPoint section, first for the fall term and then again for the spring term. The results of these eight t-tests are shown in Tables 2 and 3. As the tables show, none of the t-statistics was significant at the .05 level; there were no significant differences between the mean examination scores for any of the eight pairs of examinations given during the experimental fall and spring terms.
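To make the comparison concrete, the sketch below (in Python with SciPy, our own illustrative choice; the authors do not report the statistical software they used) reproduces a pooled-variance two-sample t-test from the Exam 1 summary statistics in Table 2. Small differences from the published t = 1.00 reflect rounding of the reported means and standard deviations.

    # Illustrative two-sample t-test from the Table 2 summary statistics (Exam 1, fall term).
    # SciPy is assumed purely for illustration; the paper does not name its software.
    from scipy.stats import ttest_ind_from_stats

    t_stat, p_value = ttest_ind_from_stats(
        mean1=131, std1=21, nobs1=44,   # traditional section
        mean2=136, std2=23, nobs2=38,   # PowerPoint section
        equal_var=True,                 # pooled-variance (Student's) t-test
    )
    print(f"t = {t_stat:.2f}, p = {p_value:.2f}")   # roughly t = -1.03, p = 0.31 (|t| close to the reported 1.00)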

The final grade distributions in the two pairs of classes were also examined using the chi-square test. The grades given in each of the sections were in the traditional A, B, C, D, and F format. The chi-square statistic measures whether the distribution of these grades differs between sections. These results are shown in Table 4. There was no significant difference between the grade distributions in the two pairs of classes; the traditional approach and the PowerPoint approach resulted in similar final grade distributions. (The grade distributions for Principles of Accounting were also examined for terms prior to and subsequent to the two-semester research period; the distribution of grades during those periods was not significantly different from the grade distributions during the research period.)
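For readers who wish to replicate this kind of test, a minimal sketch follows, again in Python with SciPy as an illustrative tool. The grade counts are hypothetical placeholders, since the paper reports only the resulting statistics (Table 4); with six grade categories (A, B, C, D, F, W) and two sections, the test has (2 - 1)(6 - 1) = 5 degrees of freedom, matching Table 4.

    # Illustrative chi-square test of independence between section type and final grade.
    # The counts below are hypothetical placeholders, NOT the study's data.
    from scipy.stats import chi2_contingency

    grade_counts = [
        #  A   B   C   D   F   W
        [  6, 10, 12,  5,  4, 12],   # traditional section (hypothetical)
        [  7,  9, 11,  6,  3, 10],   # PowerPoint section (hypothetical)
    ]
    chi2_stat, p_value, dof, expected = chi2_contingency(grade_counts)
    print(f"chi-square = {chi2_stat:.2f}, df = {dof}, p = {p_value:.3f}")   # df = 5, as in Table 4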

Given that neither individual examination scores nor overall grade distributions were significantly different between the experimental and control class sections, we fail to reject hypothesis one. Accordingly, no evidence could be found that student performance in the course was differentially affected by the inclusion or exclusion of PowerPoint presentations.

Student Satisfaction

At the end of each semester, students in each section were given two forms of satisfaction surveys to complete. The first survey was developed by the instructor and consisted of six attitudinal questions about accounting as a profession and as a subject in general. The six questions used a five-point Likert scale and are shown in Table 5.

Table 6 presents the chi-square tests used to determine between-section differences in these attitudes. There were no significant differences between the distributions of responses for the paired sections in either the fall or the spring experimental term.

Finally, the effect of the delivery system (PowerPoint) on the students' evaluations of the faculty member was examined. Since 1996, the applicable State Legislature has mandated that all State University System faculty administer the eight-question State University System Student Assessment of Instruction (SUSSAI) form in every section of every course taught (Chancellor's Memorandum, 1995). The results are public information. The mandatory SUSSAI questionnaire is shown as Table 7.

Specific instructions are given regarding the administration of this common evaluation form. It must be administered at the beginning of class near the end of the term, with the faculty member being evaluated absent during the evaluation period. Results of the evaluations are not made available to the faculty member until after course grades have been assigned.

Table 8 shows the chi-square test results for the paired sections for the fall and spring terms. During the fall term, students viewed the PowerPoint section more positively than the traditional section: the distribution of the student evaluations was significantly different for Questions 1, 2, and 8, and for all three questions the evaluations of the students in the PowerPoint section were higher. On Question 1--Description of course objectives and assignments--over 44% of the students in the PowerPoint section responded "Excellent," while none of the students in the traditional section did so. On Question 2--Communication of ideas and information--over 22% of the students in the PowerPoint section responded "Excellent," while none of the students in the traditional section did so. On Question 8--Overall rating of instructor--over 27% of the students in the PowerPoint section responded "Excellent," while none of the students in the traditional section did.
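These significance calls can be verified against the chi-square critical value for four degrees of freedom cited in the table footnotes (9.49); a minimal check, again assuming SciPy purely for illustration:

    # Verify the 5% critical value used in Tables 6 and 8 and the fall-term significance calls.
    from scipy.stats import chi2

    critical = chi2.ppf(0.95, df=4)                    # approximately 9.49
    for question, stat in [(1, 11.40), (2, 11.80), (8, 13.75)]:
        flag = "significant" if stat > critical else "not significant"
        print(f"Question {question}: {stat} vs. {critical:.2f} -> {flag} at the 5% level")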

For the spring term, there were no significant differences between the two sections, and when the fall and spring results were combined, there were no significant differences between the distributions in the two types of sections. While there may have been a mild effect on student evaluations of the instructor during the fall term, it appears that the delivery method did not significantly or consistently affect the students' evaluation of the course instructor.

Based upon the combined results of the two satisfaction instruments across all four sections of the course, we fail to reject the null of hypothesis two. Accordingly, little evidence could be found that student satisfaction in the course was differentially affected by the inclusion or exclusion of PowerPoint presentations.

CONCLUSION

The education literature is replete with anecdotal but conflicting evidence that technological innovations in the classroom affect student performance and attitudes about the subject matter being presented. While the debate will continue as to the direction and strength of these effects, the objective of this study is to lend empirical support to this discussion.

Specifically, the sophomore-level Principles of Accounting course was examined to test whether the inclusion of PowerPoint presentations affects student performance and attitudes. In this study, no discernible evidence was found that PowerPoint presentations have any impact on student outcomes.

While the study controlled for many possible factors, it did not control for the individual faculty member; different faculty members may find different results because of their unique teaching styles. It should also be noted that this study examined only the use of PowerPoint slides. It is possible that the use of other means of delivery and course management, such as videos, Internet connections, Blackboard, WebCT, etc., could result in significant differences.

The results of this study do, however, call into question the rush to bring more and more technology into the classroom. As illustrated in the introduction, technology is very expensive, both in terms of outlay costs and opportunity costs. For instance, in the authors' university, traceable costs for the business building alone approach $500,000 per year.

In this study, existing PowerPoint slides were adapted for classroom use. An average of over ten hours per week was expended in adapting these existing slides. Preparing one's own slides would obviously require a significantly greater time commitment. Perhaps it is time to stand back and carefully examine the effect of these new, costly delivery systems. If they do not make a difference, then the question is "Why incur the additional cost?"

REFERENCES

Accounting Education Change Commission (1992). The first course in accounting: Position Statement no. 2. Issues in Accounting Education, 7(2), 249-251.

Accounting Education Change Commission (1993). Evaluating and rewarding effective teaching: Issues Statement No. 5. Issues in Accounting Education, 8(2), 436-439.

Agarwal, R. & Day, A.E. (1998). The impact of the internet on economic education. Journal of Economic Education, (Spring), 99-110.

Becker, W. & Watts, M. (1998). Teaching economics: what was, is, and could be. Teaching Economics To Undergraduates: Alternatives to Chalk and Talk. Northampton, MA: Edward Elgar Press, 1-10.

Chancellor's memorandum 95-06.1, State of Florida State University System, June 1995.

Drook, G. (1994). Transcript of keynote address Washington interactive multimedia 1994, Journal of Instructional Delivery Systems, (Fall), 3-9.

Greenlaw, S. (1999). Using groupware to enhance teaching and learning in undergraduate economics. Journal of Economic Education, (Winter), 33-42.

Hagen, A., Edwards, K., & Brown, A. (1997). The impact of technology on classroom presentations: an empirical investigation. The Journal of Private Enterprise, (Spring), 54-68.

Jackson, G. (2000). The wired campus: enough is enough. The Chronicle of Higher Education, (August 11) B8-B9.

Jategaonkar, V.A., & Babu, A.J.G. (1995). Interactive multimedia instructional systems: a conceptual framework. Journal of Instructional Delivery Systems, (Fall), 24-29.

Jensen, R.E. & Sandlin, P.K. (1992a). Why do it? advantages and dangers of new waves of computer-aided teaching/instruction and what software features to consider. Journal of Accounting Education, (Spring), 25-38.

Jensen, R.E. & Sandlin, P.K. (1992b). How to do it? getting started in authoring computer-aided teaching/instruction and what software features to consider. Journal of Accounting Education, (Spring), 39-60.

Landry, R.M., Case, T., & Francisco, W. (1997). Multimedia in accounting: a survey of its use and relationship to a school's teaching/research mission. 1997 Proceedings of the Annual Meeting of the Southeastern American Accounting Association, 181-191.

Liebowitz, J. & Letsky, C. (1996). KARMA: The knowledge acquisition reference multimedia aid. T.H.E. Journal, (February), 85-87.

Liu, D. & Beamer, L. (1997). Multimedia as a teaching tool in business communication course delivery. Business Communication Quarterly, (June), 51-66.

Pea, R.D. & Gomez, L.M. (1992). Distributed multimedia learning environments: why and how? Interactive Learning Environments, (February), 73-109.

Rand, R.S., Galbreath, S.C., & Snodgrass, P. (1997). Marriage or divorce: Can multimedia and accounting unite to enhance student performance? 1997 Annual Meeting of the Southeastern American Accounting Association, 192.

Rankin, E. L. & Hoaas, D. (2001). Teaching note: does the use of computer-generated slide presentations in the classroom affect student performance and interest? Eastern Economic Journal, (Summer), 335.

Roberts, B.C. (1994). Information highways: delivery and shaping the multimedia world of tomorrow. Executive Speeches, (April/May), 4-6.

Shank, R. (1995). Active learning: integrating multimedia and artificial intelligence. Keynote address at the International Symposium on Artificial Intelligence, Monterrey, Mexico, October 18, 1995.

Young, J. R. (2004). Students say technology has little impact on teaching. The Chronicle of Higher Education, (August 13), A28.

Homer L. Bates, University of North Florida

Bobby E. Waldrup, University of North Florida
Table 1: Controls Used in Study

1 Same Instructor
2 Same Textbook
3 Same Syllabi and Assignments
4 Same Examinations
5 Same Classroom
6 Same Grading Scale

Table 2: Student's t-Test Results of Exam Scores--Fall--Semester 1

              N=   Mean   Standard    95% confidence       t=    Probability=
                          deviation   interval for mean

Exam 1
Traditional 44 131 21 124-138 1.00 0.32
PowerPoint 38 136 23 129-143 1.00 0.32
Exam 2
Traditional 30 122 34 107-136 0.606 0.55
PowerPoint 31 128 46 114-142 0.606 0.55
Exam 3
Traditional 22 141 27 128-155 0.774 0.46
PowerPoint 21 134 36 120-148 0.774 0.46
Final Exam
Traditional 22 180 32 162-198 1.33 0.19
PowerPoint 21 163 51 144-182 1.33 0.19

Table 3: Student's t-Test Results of Exam Scores--Spring-Semester 2

              N=   Mean   Standard    95% confidence       t=    Probability=
                          deviation   interval for mean

Exam 1
Traditional 38 144 26 135-153 1.10 0.28
PowerPoint 37 137 31 127-146 1.10 0.28
Exam 2
Traditional 30 129 27 118-139 0.026 0.979
PowerPoint 30 128 22 118-139 0.026 0.979
Exam 3
Traditional 29 150 27 140-159 0.238 0.813
PowerPoint 28 151 22 142-161 0.238 0.813
Final Exam
Traditional 27 178 32 164-192 0.934 0.354
PowerPoint 29 169 51 155-182 0.934 0.354

Table 4: Chi-square Test Results for Grade Distribution

Sections compared using the chi-square test    Degrees of   Chi-square    Significant at the 95%
on the A, B, C, D, F, W grade distribution     freedom      statistic     confidence level or above

Fall term: PowerPoint vs. Traditional               5           2.81      No (critical value: 11.07)
Spring term: PowerPoint vs. Traditional             5           1.79      No (critical value: 11.07)

Table 5: Questions Asked to Determine Student Attitudes

 Strongly Strongly
 Disagree Agree

1 I hate accounting.                  1 2 3 4 5
2 I enjoy thinking about accounting. 1 2 3 4 5
3 I am no good at accounting. 1 2 3 4 5
4 Accounting is a useful subject. 1 2 3 4 5
5 Accounting has little appreciation 1 2 3 4 5
 in my life.
6 Accounting is an exciting subject. 1 2 3 4 5

Table 6: Chi-square Test Results for Attitude Questions

                        Degrees of    Chi-square statistic    Significant at the 95%
Questions Compared      freedom       computed                confidence level or above

Fall--Semester 1
 Question 1 4 4.61 No *
 Question 2 4 2.77 No *
 Question 3 4 0.71 No *
 Question 4 4 4.09 No *
 Question 5 4 1.90 No *
 Question 6 4 4.73 No *
Spring--Semester 2
 Question 1 4 6.06 No *
 Question 2 4 5.04 No *
 Question 3 4 1.66 No *
 Question 4 4 4.87 No *
 Question 5 4 2.77 No *
 Question 6 4 2.66 No *

* Critical Value is 9.49

Table 7: State University System Student Assessment of Instruction
(SUSSAI)

Directions: Please assess your instructor's performance on the
following eight items by darkening one response for each. Respond to
each item according to the CODE printed below:

E = Excellent   VG = Very Good   G = Good   F = Fair   P = Poor   NR = No Response

1  Description of course objectives and assignments.          E  VG  G  F  P  NR
2  Communication of ideas and information.                     E  VG  G  F  P  NR
3  Expression of expectations for performance in this class.   E  VG  G  F  P  NR
4  Availability to assist students in or out of class.         E  VG  G  F  P  NR
5  Respect and concern for students.                           E  VG  G  F  P  NR
6  Stimulation of interest in the course.                      E  VG  G  F  P  NR
7  Facilitation of learning.                                   E  VG  G  F  P  NR
8  Overall rating of instructor.                               E  VG  G  F  P  NR

Table 8: CHI-SQUARE TEST RESULTS FOR SUSSAI QUESTIONS

                        Degrees of    Chi-square statistic    Significant at the 95%
Questions Compared      freedom       computed                confidence level or above
Fall--Semester 1
 Question 1 4 11.40 Yes *
 Question 2 4 11.80 Yes *
 Question 3 4 6.13 No *
 Question 4 4 2.83 No *
 Question 5 4 6.41 No *
 Question 6 4 .5 No *
 Question 7 4 6.87 No *
 Question 8 4 13.75 Yes *
Spring--Semester 2
 Question 1 4 3.62 No *
 Question 2 4 3.56 No *
 Question 3 4 2.93 No *
 Question 4 4 2.36 No *
 Question 5 4 0.36 No *
 Question 6 4 4.60 No *
 Question 7 4 2.07 No *
 Question 8 4 1.03 No *
Fall & Spring Semester Combined
 Question 1 4 6.42 No *
 Question 2 4 8.79 No *
 Question 3 4 0.93 No *
 Question 4 4 3.42 No *
 Question 5 4 4.62 No *
 Question 6 4 5.18 No *
 Question 7 4 5.57 No *
 Question 8 4 8.49 No *

* Critical Value is 9.49

Table 9: Classroom Technology Costs

Direct Cost
 Podium $65,000
 Hardware $19,500
 VCR/DVD $5,200
 Projector's A.V. piece $110,500
 Camera $39,000
 Bulbs $16,250
 Software licenses $1,300
Total Direct Costs $256,750
Allocated Costs
Tech support $250,000
Total Associated Costs $506,750