
Assessing online student performance in an MBA program within AACSB assurance of learning guidelines.

INTRODUCTION

Assurance of Learning is an important part of educational programs and has become a major issue for accredited business schools. The purpose of the Assurance of Learning process is to establish a procedure for schools to routinely identify program areas needing improvement, recommend ways to enhance the program, and close the loop by implementing program enhancements (Martell & Calderon, 2005). Such assessment is based on periodic review of each program's learning goals.

At the same time, business schools are increasingly adopting technology that allows for new modes of course delivery. One such mode of interest to business schools is the online environment. Online enrollment has been increasing steadily over the past few years and has been attracting a new segment of students rather than merely cannibalizing current enrollment (Eastman et al., 2003). Online delivery is attractive to the Generation X segment (ages 25-39) and the Millennial segment (ages 24 and under), two major target market segments for MBA programs (Hockberg, 2006).

The confluence of growth in online course instruction and a greater emphasis on program assurance of learning has led business schools to develop procedures for assessing online student performance within the framework of the AACSB guidelines for program assessment (AACSB International, 2009a; Eastman et al., 2003). The focus of this study is to evaluate online student performance in the context of AACSB assessment procedures. The paper discusses the AACSB program assessment procedure and course modes of delivery, develops hypotheses pertaining to the two modes of delivery (online and face-to-face), and describes the research methodology and sample. Results comparing online student performance to face-to-face student performance are analyzed, and future research is discussed.

BACKGROUND OF THIS STUDY

MBA Program Assessment

The Association to Advance Collegiate Schools of Business (AACSB) standards provide an evaluation procedure consistent with Tyler's (1942) objectives-oriented educational evaluation (AACSB International, 2009b). For this research, AACSB assurance of learning standards are used to measure student learning outcomes and provide a roadmap for continuous improvement (Harich, Fraser, & Norby, 2005). According to AACSB International (2009c), the process for program assessment includes the following five steps:

Step 1--Define learning goals and objectives.

Step 2--Align curriculum with goals.

Step 3--Identify instruments and measures.

Step 4--Collect, analyze and disseminate assessment data.

Step 5--Use assessment data for continuous improvement.

A mission with learning goals was developed for the MBA program and aligned with the program's curriculum to ensure content coverage of these goals. Per AACSB Standard 18, an MBA is a program in which student learning is at the level of 'capacity to apply knowledge' (AACSB International, 2009b). Therefore, program goals need to be at the applied level of learning or higher.

Departing from past requirements, the new AACSB standards call for more reliance on direct measures for assessing student performance (Martell & Calderon, 2005). In keeping with the direct measure approach, many business schools are embedding rubrics in core courses of the program curriculum to directly measure student performance.

A second departure from past AACSB guidelines is that the new standards require performance to be measured at the individual student level unless the learning goal is a team-oriented one. For the School's MBA program, all assessed student work takes the form of individual student assignments.

OBJECTIVES AND HYPOTHESIS DEVELOPMENT

The research objective is to evaluate student performance by comparing online and face-to-face course sections. Terry (2007) relied on indirect measures when comparing what he referred to as the campus (face-to-face) mode and the online mode. Unlike Terry's (2007) research, this research uses direct measures to assess student learning in the MBA program.

In the design of the School's MBA program, every section of each core MBA course, regardless of mode of delivery, shares a common syllabus, textbook, and group of faculty teaching that course. Thus, when comparing student performance between online and face-to-face course sections in the program, the syllabus, textbook, and faculty instructors are the same across the two modes of delivery. These similarities resulted in the formulation of the first hypothesis:

H1: Students in online course sections perform as well as students in face-to-face course sections, given a common syllabus, textbook, and group of faculty teaching each course.

Though the syllabus, textbook, and faculty instructors are the same for online and face-to-face classes, the difference in mode of delivery influences how students fulfill some course requirements and how instructors relate and communicate with students. For example, face-to-face classes often rely on oral class discussions, whereas online classes often use discussion boards for class participation. Face-to-face classes are synchronous, at least for the prescribed period of time the class meets, whereas many online classes are asynchronous. Betts (2008) and Handy and Basile (2005) suggest that certain activities in a course can enhance student learning, especially when considered in light of Bloom's (1956) learning levels.

Bloom (1956) proposed six learning levels: knowledge, comprehension, application, analysis, synthesis, and evaluation. To examine whether student mastery differs between the online and face-to-face modes of delivery at particular levels, the traits in the five rubrics used to measure student performance in the MBA program were varied by learning level: two traits at Level 2 (comprehension), seven traits at Level 3 (application), and 11 traits at Level 4 (analysis) or higher. The concept of learning level resulted in the formulation of the second hypothesis:

H2: Student performance in online course sections will be the same as in face-to-face course sections regardless of Bloom's learning level.

RESEARCH METHODS

The research methods are described in four parts: the assessment rubrics and their traits, the sample and data collection, the assessment procedure, and the measurement of student performance.

Assessment Rubrics and Their Traits

The heart of assessment is to determine whether students master the learning goals set forth for the program. Five course-embedded rubrics were used. Each of the five assessment rubrics included four traits, resulting in a total of 20 measured traits. Each rubric, with a corresponding student assignment, was embedded in a core MBA course, with the capstone Seminar in Strategic Management course having two embedded measures. The rubrics and associated traits are included in the appendix to this paper.

The five rubrics were developed by the School's Curriculum Planning and Assessment Committee in conjunction with the faculty members teaching the courses where the measures were embedded.

Sample and Data Collection

All student assignments in both online and face-to-face sections of the four MBA courses with embedded measures were collected over one semester. In total, 217 student assignments were assessed: 78 in face-to-face course sections and 139 in online sections. Each assignment was assessed on the four traits of the corresponding rubric. The total number of items assessed was 868 (217 assignments × 4 traits).

Assessment Procedure

Faculty teams of two to four members assessed the student assignments in each course with an embedded measure. Coursework from both online and face-to-face sections was assessed by the same group of faculty. Student names were omitted from the assignments, and online student performance scores were compiled separately from face-to-face scores. A large portion of the Business School's faculty was involved in the assessment.

Measurement of Student Performance

The faculty determined that graduates are expected to perform at a level meeting or exceeding that set forth in the learning goals of the MBA program. Therefore, student performance was measured on a three-point scale: below expectations, meets expectations, and exceeds expectations. Since this was the first time the MBA program was assessed, the overall goal of the Business School was that 75 percent of students would meet or exceed expectations on each trait.
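
As an illustration of how such a benchmark might be checked, the sketch below (in Python) tallies scores recorded on the three-point scale and flags any trait where fewer than 75 percent of students meet or exceed expectations. The trait labels and scores in the sketch are hypothetical examples, not the study's data, and the code is not the Business School's actual scoring tool.

    # Illustrative sketch only: tallies rubric scores on the three-point scale
    # (below / meets / exceeds expectations) and flags any trait where fewer
    # than 75 percent of students meet or exceed expectations.
    # The trait labels and scores below are hypothetical, not the study's data.
    from collections import defaultdict

    BENCHMARK = 0.75

    # (trait, score) pairs as faculty assessors might record them.
    assessments = [
        ("Trait #1", "meets"), ("Trait #1", "exceeds"), ("Trait #1", "below"),
        ("Trait #2", "meets"), ("Trait #2", "exceeds"), ("Trait #2", "meets"),
        ("Trait #2", "below"), ("Trait #1", "meets"),
    ]

    counts = defaultdict(lambda: {"total": 0, "meets_or_exceeds": 0})
    for trait, score in assessments:
        counts[trait]["total"] += 1
        if score in ("meets", "exceeds"):
            counts[trait]["meets_or_exceeds"] += 1

    for trait, c in sorted(counts.items()):
        share = c["meets_or_exceeds"] / c["total"]
        status = "meets goal" if share >= BENCHMARK else "below goal"
        print(f"{trait}: {share:.0%} meet or exceed expectations ({status})")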

RESULTS OF THIS STUDY

Table 1 shows the number of assessed items where students performed below expectations and the number where students met or exceeded expectations. In total, there is a significant difference (p < .01) between student performance in face-to-face and online classes. Therefore, H1 is rejected.

The analysis of student performance across all traits does not reveal whether the level of learning influenced student performance. Table 1 also shows the division of student performance by Bloom's learning level. When analyzing performance by learning level, online students performed as well as face-to-face students at Bloom's level 2 (p = .66) and level 3 (p = .75). However, online students did not perform as well as face-to-face students at Bloom's level 4 (p < .001). Therefore, H2 is supported for Bloom's learning levels 2 and 3 but not for level 4.
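
For readers who wish to reproduce comparisons of this kind, the sketch below (in Python, using scipy) runs chi-square tests on the counts reported in Table 1. It is offered as an illustration rather than as the authors' original analysis; the computed p-values may differ slightly from those reported, for example depending on whether a continuity correction is applied to the 2x2 tables.

    # Illustrative reproduction of the Table 1 comparisons using scipy.
    # Counts are taken directly from Table 1. scipy applies Yates' continuity
    # correction to 2x2 tables by default, so p-values may differ slightly
    # from the published ones.
    from scipy.stats import chi2_contingency

    # Rows: face-to-face, online; columns: below expectations, meets/exceeds.
    tables = {
        "Level 2": [[1, 21], [6, 53]],
        "Level 3": [[16, 86], [34, 161]],
        "Level 4": [[21, 167], [75, 227]],
        "All levels": [[38, 274], [115, 441]],
    }

    for label, observed in tables.items():
        chi2, p, dof, expected = chi2_contingency(observed)
        print(f"{label}: chi-square = {chi2:.2f}, p = {p:.3f}")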

CLOSING THE LOOP

The point of any assessment program is to create a vehicle that helps continuously improve academic programs. Even when student performance meets or exceeds the school's expectation goals, the assessment procedure can identify areas that, if enhanced, will likely lead to improvements in the academic program. Using the comparative information in Table 1 and the suggestions of Palvia and Palvia (2007) and Handy and Basile (2005), the Business School recommended that instructors add or enhance activities in their online course sections to improve student performance in Bloom's level 4 and higher learning. Such suggestions include:

1) Make written assignment instructions more explicit. (Instructors in face-to-face courses tend to hand out assignment instructions in class, go over the assignment with students, and orally give more thorough explanations and examples. This extra explanation step may be missing in many online classes.)

2) Provide students with a sample paper to illustrate expectations.

3) Use the discussion board to cover similar assignments and activities. (This is often done in face-to-face courses during oral class participation sessions.)

4) Add synchronous Webinar(s) to clarify the assignment, the assignment expectations, and cover similar activities or cases.

During the next MBA program assessment period, student assignments will be collected and assessed to determine whether the course enhancements outlined above improved student performance in the online course sections.

LIMITATIONS OF THIS STUDY

The results of this study are from an assessment of one MBA program. Different graduate programs and programs at other schools may produce dissimilar results. Because of the limited sample size and first-time use of the assessment rubrics, caution is recommended when generalizing the study's results. To address some of these issues, future plans include repeating the assessment of the MBA program using student assignments from other semesters and using this assessment method to compare online versus face-to-face student performance in other school programs.

CONCLUSION

This research compared student performance by mode of delivery. The study assessed student performance in online course sections versus face-to-face course sections of an MBA program using the AACSB program assessment procedure. The research included the use of individual student assignments, direct assessment measures, and multiple student learning levels. The research found that student performance in online course sections was about the same as in face-to-face course sections for Bloom's level 2 and 3 learning. However, student performance in online sections was lower than in face-to-face sections for Bloom's level 4 learning.

SUGGESTIONS FOR FUTURE RESEARCH

The results of this study suggest several areas for further research, including which online course enhancements deliver better outcomes, how synchronous and asynchronous aspects of online course delivery affect student learning, and the best ways to close the gap between online and face-to-face student learning (especially at Bloom's level 4 and higher).
Appendix
Rubrics for MBA Program Learning Goals

Learning Goal (MBA graduates will be able to): Analyze business operations and processes

  Course where assessment measure is embedded: Financial Management Course
    Trait #1: Calculate the weighted average cost of capital
    Trait #2: Calculate variables needed to forecast free cash flows
    Trait #3: Apply the corporate valuation model
    Trait #4: Compare valuation to market conditions and make recommendations

  Course where assessment measure is embedded: Management Information Systems Course
    Trait #1: Describe firm-based value chain model and decision-making levels
    Trait #2: Apply value-chain model and decision-making level identification to the specific firm's situation
    Trait #3: Analyze opportunities in terms of functional areas, business process(es) and decision levels for IS/IT implementation for the firm
    Trait #4: Analyze the matching functionality of the IS/IT product(s)

  Course where assessment measure is embedded: Operations Management Course
    Trait #1: Identify the operational problems in the business situation
    Trait #2: Analyze appropriate OM concepts and tools
    Trait #3: Apply OM concepts and tools to improve the business situation
    Trait #4: Evaluate alternative solutions

Learning Goal (MBA graduates will be able to): Analyze changes in the business environment to develop strategies that respond to emerging opportunities and threats

  Course where assessment measure is embedded: Seminar in Strategic Management Course
    Trait #1: Analyze the external environmental forces and driving forces causing industry change
    Trait #2: Analyze the economic characteristics of the industry
    Trait #3: Prepare a profitability analysis of the industry
    Trait #4: Analyze the potential strategic moves of competitors

Learning Goal (MBA graduates will be able to): Apply cross-functional approaches to organizational issues

  Course where assessment measure is embedded: Seminar in Strategic Management Course
    Trait #1: Analyze the strategic issues of the situation by business discipline
    Trait #2: Evaluate the firm's prices and costs using value chain approach
    Trait #3: Prepare a SWOT analysis
    Trait #4: Analyze the firm's financial performance

REFERENCES

AACSB International. (2009a). Overview of Assessment: Why Do Assessment? Retrieved February 7, 2009, from http://aacsb.edu/resource_centers/assessment/overview-why.asp

AACSB International. (2009b). Eligibility Procedures and Accreditation Standards for Business Accreditation. Retrieved February 7, 2009, from http://www.aacsb.edu/accreditation/process/documents/AACSB_STANDARDS_Revised_Jan08.pdf

AACSB International. (2009c). The Assessment Process. Retrieved February 7, 2009, from http://www.aacsb.edu/resource_centers/assessment/overview-process.asp

Betts, S. C. (2008). Teaching and Assessing Basic Concepts to Advanced Applications: using Bloom's Taxonomy to Inform Graduate Course Design. Academy of Educational Leadership Journal, 12 (3): 99-106.

Bloom, B. (1956) Taxonomy of Educational Objectives, Handbook I: Cognitive Domain. New York: Longman, Green and Company.

Eastman, J. K., Swift, C.O., Bocchi J., Jordan R., & McCabe, J. (2003). The Challenges and Benefits of Teaching MBA Courses Online: Lessons from the Georgia WEBMBA[TM]. AMA Winter Educators Conference Proceedings, 14, 153-154.

Handy, S. A., & Basile A. (2005). Improving Accounting Education Using Bloom's Taxonomy of Educational Objectives. Journal of Applied Research for Business Instruction, 3 (3): 1-6.

Harich, K. R., Fraser, L., & Norby, J. (2005). Taking the Time to Do It Right: A Comprehensive, Value-added Approach for Assessing Writing Skills. In K. Martell & T. Calderon (Eds.), Assessment of Student Learning in Business Schools: Best Practices Each Step of the Way, 1 (2): 119-138. The Association for Institutional Research and AACSB International.

Hockberg, J. (2006). Online MBA Programs: Emulating Global Business. Distance Education Report, 10 (2): 7-8.

Martell, K., & Calderon, T. G. (2005). Assessment in Business Schools: What It Is, Where We Are Going, and Where We Need to Go Now. In K. Martell & T. Calderon (Eds.), Assessment of Student Learning in Business Schools: Best Practices Each Step of the Way, 1 (1): 1-26. The Association for Institutional Research and AACSB International.

Palvia, S. C., & Palvia, P. C. (2007). The Effectiveness of Using Computers for Software Training: An Exploratory Study. Journal of Information Systems Education, 18 (4): 479-489.

Terry, N. (2007). Assessing Instruction Modes for Master of Business Administration (MBA) Courses. Journal of Education for Business, March/April, 220-225.

Tyler, R. W. (1942). General Statement of Evaluation. Journal of Educational Research, 35 (4): 492-501.

About the Authors:

Linda A. Hayes is Associate Professor and Director of Program Assessment and Online Services at the University of Houston-Victoria School of Business Administration. She received a B.S.M.E. from Clarkson University, an M.B.A. from University of Houston, and a Ph.D. from University of California at Berkeley. Dr. Hayes has 15 years of industry experience. Her research interests include decision-making, judgment and perceptions. Dr. Hayes was a NASA Summer Faculty Fellow. Recently, she has published in the Journal of Business Ethics, Journal of Management Development, Journal of International Marketing, and International Journal of Innovation and Learning.

June Lu is Associate Professor of Management at University of Houston-Victoria. She received her doctorate from the University of Georgia. Her specialties are management information systems and e-commerce. Her research spans wireless mobile technology acceptance and electronic/mobile commerce in different cultural settings, and effectiveness of using online learning tools in MIS education. She has published articles in Information & Management, Journal of Strategic Information Systems, Electronic Commerce Research, Journal of Internet Research, Journal of Computer Information Systems, International Journal of Innovation and Learning, and other journals.

Linda A. Hayes

June Lu

University of Houston--Victoria
Table 1
Assessment of Student Performance in Face-to-Face Compared to Online Course Sections, MBA Program

                        Face-to-Face Course Sections          Online Course Sections
                        (N = 78 student assignments,          (N = 139 student assignments,
                        312 items assessed)                   556 items assessed)

Bloom's Level           Below          Meets or Exceeds       Below          Meets or Exceeds       Chi-Square
of Learning             Expectations   Expectations           Expectations   Expectations           Value

Level 2                  1              21 (95%)                6             53 (90%)              p = .66
Level 3                 16              86 (84%)               34            161 (83%)              p = .75
Level 4                 21             167 (89%)               75            227 (75%)              p < .001
Total, all levels       38             274 (88%)              115            441 (79%)              p < .01