
ENSURING ACADEMIC INTEGRITY IN ONLINE COURSES: A Case Analysis in Three Testing Environments

INTRODUCTION

The issue of student identification and authentication is now an essential area of compliance with federal policy and law. Providers of online education must develop policies and procedures for verifying that the student who registers for a course is the same student who does the work and receives the academic credit. Protecting the integrity of online courses and programs, and satisfying the accrediting agencies responsible for enforcing the law, requires an investment of time and resources in the prevention and detection of academic dishonesty.

FEDERAL COMPLIANCE

In August 2008, Congress passed the Higher Education Opportunity Act, reauthorizing the 1965 Higher Education Act, as amended. In a section addressing accreditation and program integrity, 34 C.F.R. § 602.17(g), the law states in pertinent part:
[The agency] Requires institutions that offer distance education... to
have processes in place through which the institution establishes that
the student who registers in a distance education... course or program
is the same student who participates in and completes the course or
program and receives the academic credit.


The statute specifically requires institutions to verify the identity of a student who participates in a class or coursework by using methods such as a secure login and passcode; proctored examinations; and new or other technologies and practices that are effective in verifying student identity.... 34 C.F.R. § 602.17(g)(1)(i-iii).

WAYS TO ENSURE ACADEMIC INTEGRITY

Educators in higher education use a variety of methods and tools to ensure academic integrity in online courses. Creating a clear academic dishonesty policy and making it available to students is the first step (Simonson, Smaldino, Albright, & Zvacek, 2003). Using authentic assessment strategies, such as collaborative projects and e-portfolios, is also effective (Bobak, Cassarino, & Finley, 2005). Requiring an outline and a draft for individual written projects prevents students from submitting others' work at the end of a semester; wikis are a useful tool for implementing this process. Software applications such as Turnitin and SafeAssign can also detect plagiarism by comparing students' essays against electronic databases.

Proctored examinations are another way to verify student identity. Whether an exam is delivered through a lockdown browser and recorded video features in the learning management system, or proctored at a distance by a human proctor via video streaming, consideration must be given to convenience, affordability, minimal hardware requirements, and, most importantly, security.

Prince, Fulton, and Garsombke (2009) found significant differences in average test scores between tests taken electronically without a proctor and those administered with a live or remote proctor: students scored significantly lower on proctored exams than on nonproctored exams. Cochran, Troboy, and Cole (2010) found that grades for the remote proctor group trended lower than those of nonparticipants. They also found that remote proctor participants felt the remote proctoring system had no impact on their exam-taking ability.

To meet the testing needs of online education, many vendors have developed tools that provide proctoring services. This study involved Respondus Monitor, Respondus LockDown Browser, and the Blackboard test tools available to faculty and students at the university where the study was conducted.

Nonproctored Recorded Online Testing Environment

Respondus Monitor is a companion product for LockDown Browser that enables students to take online exams in nonproctored environments ("Respondus Monitor," 2015). There is no additional software to install; students use their own computers with a standard webcam to record assessment sessions, and instructors can review the entire session from the Blackboard course site. When the tool is used, students enrolled in the course pay a flat fee per course per semester; institutions can also purchase seats and allow students to use it at no charge.

Nonproctored Lockdown Online Testing Environment

Respondus LockDown Browser is a custom browser that locks down the testing environment and can be integrated with the learning management system. When students are required to use LockDown Browser for a test, they are unable to print, copy, go to another website, or access other applications on the same computer. Once a test is started, students are locked into it until they submit it for grading ("LockDown Browser," 2015).

Nonproctored Online Testing Environment

The Blackboard quiz/survey tool allows faculty to create timed tests with questions randomly selected from a pool. On the test day, students take the test within the time frame set by the faculty member. Because questions are drawn randomly from the pool, the test can differ from student to student. When time expires, the system shuts down the test.

The Blackboard quiz tool allows faculty to control when the results of the test are released to students, as well as the type of results and feedback that are released. Instructors are encouraged to consult with an instructional designer to discuss options and settings that will meet the goals of individual assessments, and the course as a whole.

ABOUT THE COURSE IN THE STUDY

This auditing course was cross-listed, offered to both undergraduate and graduate students, and open to all students at the university. The same faculty member had offered the course online for several semesters. The overall goal of the course was to help prepare students for a professional career either working as an auditor or working with auditors. The course introduced numerous professional topics such as the importance of auditing, management assertions, risk, evidence, reporting, and professional liability. The material covered is tested in a standalone section of the Certified Public Accountant examination.

The course included two midterm exams and one comprehensive final exam. The exams contained true-or-false and multiple-choice questions, several cases, and a four-part company analysis project. The final exam was weighted: approximately one half of the points were based on material already tested, and the other half on two chapters that had not been previously tested.

To ensure the integrity of the testing process, different testing tools were used across semesters. In the summer 2014, fall 2014, and spring 2015 semesters, Respondus Monitor was used; in spring 2014, Respondus LockDown Browser was used; and in summer and fall 2013, the Blackboard test tool was used. All exams used the same settings: questions were randomized, displayed one at a time, and backtracking was prohibited. The exams were open from Sunday through Wednesday, with 80 minutes allowed for the final exam and 55 minutes for each of the two midterm exams.

DATA COLLECTION AND ANALYSIS

The instructor collected the exam scores and final grades from the same course offered over several semesters with different testing tools. The instructor removed identifiable student information before data analysis. Eighty-seven students used Respondus Monitor, 32 students used Respondus LockDown Browser, and 38 students used the Blackboard test tools. One-way ANOVA was used to determine whether grades differed among the three groups. Among the 87 students whose tests were administered in Monitor, one student did not take any of the tests; that data set was therefore dropped from the analysis.
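
For readers who wish to run a similar comparison on their own grade data, the following is a minimal sketch in Python using pandas and SciPy (not the software used in this study). It assumes a hypothetical file named scores.csv with one row per student and columns named environment (Monitor, Lockdown, or Blackboard) and final_exam; the file and column names are illustrative only.

# Minimal sketch: descriptive statistics and a one-way ANOVA comparing
# final exam scores across three testing environments. The file
# "scores.csv" and its column names are hypothetical.
import pandas as pd
from scipy import stats

df = pd.read_csv("scores.csv")

# Mean, standard deviation, and N per environment (cf. Tables 1-3)
print(df.groupby("environment")["final_exam"].agg(["count", "mean", "std"]))

# One-way ANOVA: do mean final exam scores differ by environment? (cf. Tables 4-7)
groups = [g["final_exam"].dropna().values
          for _, g in df.groupby("environment")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")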

RESULTS

Descriptive statistics were calculated on the mean scores of exams administered in the three testing environments: the nonproctored recorded online testing environment (Monitor), the nonproctored lockdown online testing environment (LockDown Browser), and the nonproctored online testing environment (Blackboard). As shown in Tables 1-3, there was little difference in mean scores among the three testing environments. However, the standard deviations of the final exam and of the total points differed across the three environments and were largest for Monitor, in part because two students did not take some of the exams when the tests were given in Respondus Monitor.

One-way analysis of variance (ANOVA) was used to determine whether differences existed in the mean scores of tests administered in the three testing environments. As shown in Tables 4-7, no statistically significant differences were detected across the three testing environments in the mean scores of the exams.

DISCUSSION AND IMPLICATIONS

Although there were fewer students with Ds and Fs when exams were administered in Respondus Monitor (4.62%) than in LockDown Browser (9.09%) and the Blackboard test tool (5.12%), there was a large difference among the three testing environments in the standard deviation of the final exam (Monitor, 25.42; Lockdown, 15.56; Blackboard, 17.88) and of the total exam points (Monitor, 71.19; Lockdown, 41.61; Blackboard, 56.33). This difference might indicate that a technology-based nonproctored testing tool with streaming audio and video, such as Monitor, is the environment that best discriminated among the students in this study.

Although the result was not statistically significant, the standard deviations of exams administered in Monitor were larger than those of exams administered in the other two environments. It is recommended that a technology-based nonproctored testing tool with streaming audio and video, such as Monitor, be used for high-stakes exams when human-proctored testing is not feasible for students because of constraints such as cost, travel time, and scheduling. Other types of assessments and online testing can also be used, but they need to be carefully designed and implemented in online courses.

The possibility of academic dishonesty can be better managed by proactively assisting students in developing legitimate strategies and by providing resources and support throughout their academic journey. When designing student assessments, faculty can consider different types of assessments to evaluate student performance against the stated learning objectives. These assessments can include, but are not limited to, online testing, online proctored testing, on-campus testing, remote proctored testing, and authentic assessment.

No matter what type of assessment faculty use, it is imperative to have an open, ongoing dialogue with students about academic dishonesty and the consequences of breaching the policy. Embedded activities about academic dishonesty scheduled for the first week of class, such as a pop quiz or an online discussion, may be an effective way to communicate the policy to students.

This study was based on a single course offered in different semesters with a limited number of students, and the results were not statistically significant. However, if human proctoring is not a feasible option for students, the findings indicate that a technology-based nonproctored testing tool with streaming audio and video, such as Monitor, may be an effective way to address issues of academic integrity, especially for high-stakes tests such as midterm and final exams. Further studies using larger data sets are necessary.

REFERENCES

Bobak, R., Cassarino, C., & Finley, C. R. (2005). Three issues in distance learning. Distance Learning, 1(5).

Cochran, L. F., Troboy, L. K., & Cole, T. L. (2010). A test of integrity: Remote proctoring in an online class. Journal of Business Administration Online, 9(2).

LockDown Browser. (2015, March 15). Retrieved from https://www.respondus.com/products/lockdown-browser/

Prince, D. J., Fulton, R.A., & Garsombke, T. W. (2009). Comparisons of proctored versus non-proctored testing strategies in graduate distance education curriculum. Journal of College Teaching and Learning, 6(7), 51-62.

Respondus Monitor. (2015, March 15). Retrieved from https://www.respondus.com/products/monitor/

Simonson, M., Smaldino, S., Albright, M., & Zvacek, S. (2003). Teaching and learning at a distance (2nd ed.). Upper Saddle River, NJ: Prentice Hall.

Berhane Teclehaimanot

The University of Toledo

Jiyu You

University of Michigan

Diana R. Franz, Mingli Xiao, and Sue Ann Hochberg

University of Toledo

* Berhane Teclehaimanot, Professor of Educational Technology, Department of Curriculum & Instruction, Judith Herb College of Education, The University of Toledo, Mail Stop 924, Gillham Hall (GH) 2000JJ, Toledo, OH 43606. Phone: 419-530-7979. E-mail: berhane.teclehaimanot@utoledo.edu
TABLE 1 Mean and Standard Deviation of Exams Administered in Monitor

                    N   Minimum  Maximum  Mean     SD

Exam 1              86  50       100       83.49   12.009
Exam 2              86   0       100       84.28   15.459
Final exam          86   0       150      127.37   25.421
Total points        86  70.0     481.0    389.907  71.1879
Valid N (listwise)  86

TABLE 2 Mean and Standard Deviation of Exams Administered in LockDown

                    N   Minimum  Maximum  Mean     SD

Exam 1              33   42       98       81.94   14.313
Exam 2              33   58      100       83.82   10.409
Final exam          33   86      148      132.18   15.559
Total points        33  336.5    479.5    418.424  41.6126
Valid N (listwise)  33

TABLE 3 Mean and Standard Deviation of Exams Administered in Blackboard

                    N   Minimum  Maximum  Mean     SD

Exam 1              39   58       96       80.56   10.789
Exam 2              39   52      100       84.00   11.211
Final exam          39   84      150      125.18   17.881
Total points        39  269.0    512.0    423.013  56.3328
Valid N (listwise)  39

TABLE 4 ANOVA Results for Exam 1 by Testing Environment

                Sum of Squares  df   Mean Square  F      Sig.

Between groups   21.194          23  .921         1.375  .135
Within groups    89.825         134  .670
Total           111.019         157

TABLE 5 ANOVA Results for Exam 2 by Testing Environment

                Sum of Squares  df   Mean Square  F      Sig.

Between groups   23.242          23  1.011        1.543  .067
Within groups    87.777         134   .655
Total           111.019         157

TABLE 6 ANOVA Results for Final Exam by Testing Environment

                Sum of Squares  df   Mean Square  F      Sig.

Between groups   28.449          29  .981         1.521  .059
Within groups    82.570         128  .645
Total           111.019         157

TABLE 7 ANOVA Results for Total Points by Testing Environment

                Sum of Squares  df   Mean Square  F      Sig.

Between groups   84.019         116  .724         1.100  .372
Within groups    27.000          41  .659
Total           111.019         157

TABLE 8 Percentages of Letter Grades

Environments/Grades  A      B      C      D & F

Monitor              34.88  36.04  24.42  4.62
Lockdown             33.33  36.36  21.21  9.09
Blackboard           25.64  33.33  35.90  5.12