Enhancing critical thinking in online learning.
This study compared critical thinking in undergraduate students engaged in case study learning under two methods: (a) individual student analysis and (b) computer-supported collaborative analysis. Case-based learning was used as an instructional strategy to engage and motivate undergraduate students enrolled in a course designed to increase academic success and retention. Case study learning increased critical thinking skills under both conditions.
Higher order reasoning skills are required to cognitively manage the increasingly complex ways we communicate, collaborate, and work with others (Halpern, 1995). "A literate person must not only excel in reading and writing text, but also must be able to listen and speak, and read and write fluently through text, images, motion video, charts and graphs, and hypertext across a wide range of media" (North Central Regional Educational Laboratory, 2003, 9). In addition, individuals must be able to manage a vast array of resources within complex network systems. "The sheer magnitude of human knowledge, world globalization, and the accelerating rate of change due to technology necessitates a shift in our children's education--from plateaus of knowing to continuous cycles of learning" (North Central Regional Educational Laboratory, 2004, Executive Summary, 7).
Finding an agreed-upon definition of critical thinking is daunting. Critical thinking has become a "mystified concept" due to its abstract nature and lack of common understanding. "Ask twelve psychology faculty members to define the term critical thinking, and you may receive twelve overlapping but distinct definitions" (Halonen, 1995, p. 75). Nevertheless, reference is frequently made to Bloom's taxonomy of educational objectives. The cognitive levels above knowledge and comprehension--application, analysis, synthesis, and evaluation--are considered critical thinking (Bloom, 1956).
Anderson, Krathwohl, Airasian, Cruikshank, Mayer, Pintrich, Raths, and Wittrock (2001) revised Bloom's taxonomy into a two-dimensional framework: a Knowledge dimension and a Cognitive Process dimension. Within the Knowledge dimension, "a fourth, and new category, Metacognitive Knowledge [was created and] ... involves knowledge about cognition in general as well as awareness of and knowledge about one's own cognition." Within the Cognitive Process dimension, "three categories were renamed, the order of two was interchanged, and those category names retained were changed to verb form to fit the way they are used in objectives" (Krathwohl, 2002, p. 214). Due to the complex nature of critical thinking and the difficulty of assessing it, few empirical studies investigating critical thinking development in undergraduate students exist (Pithers, 2000). The few that do exist are not promising with respect to higher education's success in promoting critical thinking. In a study assessing the critical thinking of 256 university students using a Critical Reasoning Test, Pithers and Soden (1999) found no significant between-group differences in critical thinking for graduate versus nongraduate students. The authors attribute the absence of significance to a lack of clarity surrounding the construct of critical thinking and of reliable methods to assess it, as well as to a primary instructional focus on subject-matter content. Similar findings are reported within a Teaching of Psychology issue on critical thinking. "A majority of students still demonstrate characteristics that correspond to a concrete thinking level rather than use formal-reasoning principles that Piaget ascribed to adult thinkers" (de Sanchez, 1995, p. 72). Other studies also support the view that adults do not necessarily develop critical thinking as a natural part of development.
Arons (1979) and Whimbey & Lochhead's (1986) studies (as cited in de Sanchez, 1995) found that students "have difficulty in defining and resolving problems, changing focus, considering alternatives, and defining strategies" (de Sanchez, 1995, p. 73). Deficiencies in thinking skills may be attributed to instruction that emphasizes memorizing unrelated and disconnected bits of information. Students conditioned in this type of learning often build "weak, rigid, and stereotyped thinking schemata, which results in stagnation, routine and superficial intellectual designs, and low cognitive levels" (de Sanchez, 1995, p. 73).
There is a move in education from a view of learning as the ability of students to reproduce information to the ability of students to critically evaluate and synthesize knowledge within contextual and relevant learning environments (Gagnon & Collay, 2001). "Constructivist refers specifically to the assumption that humans develop by engaging in the personal and social construction of knowledge ... Thus, humans construct knowledge; we do not receive and internalize predigested concepts without simultaneously reacting to them and engaging them within our own mental maps and previous experience" (Schmuck, 2001, p. x). Constructivists emphasize the dynamic nature of learning, as students engage in authentic tasks situated within relevant contexts. The emphasis is "on learning rather than teaching, and on facilitative environments rather than instructional goals" (Collins, 1996, p. 347).
Case-based instruction is an "active-learning pedagogy designed for problem analysis and problem-solving, stressing a variety of viewpoints and potential outcomes" (Cranston-Gingrass, Raines, Paul, Epanchin & Roselli, 1996). Well-written cases motivate and engage students as they analyze relevant issues from multiple perspectives. Experiential learning through case study learning is "likely to foster students' learning on a higher-order level, such as their critical thinking ability and propensity for self-direction in learning" (Kreber, 2001, p. 217). Computer conferencing technologies allow students to "examine their joint assumptions and share mental models of thought" (Pellegrino, 1995, p. 12). The complexities of collaborative analysis are simplified by the technology's provision of written transcripts of the dialogue, which eases the cognitive load involved in referencing, searching, and updating the conversation. The written format also makes the students' tacit knowledge public, so faulty thinking, naive conceptions, and errors in understanding are likely to be found and corrected (Klemm, 2002). Asynchronous learning networks expand the time and space limitations of the classroom, allowing for student discourse outside of the classroom at virtually any time. The written dialogue provides documentation of student participation in the forum, easing the assessment process (Kemery, 2000), and makes students' participation and contributions public, promoting pride of ownership (Klemm, 2002).
A nonequivalent (pretest and posttest) control-group research design (Campbell & Stanley, 1971) was used. The independent variable, the case study analysis method, had two treatment levels: (a) individual case study analysis and (b) collaborative asynchronous computer-mediated analysis. The groups analyzed three case studies over a three-week period: the comparison group analyzed the case studies individually, and the experimental group analyzed them collaboratively using asynchronous computer-mediated technology. Case study analyses were assigned as homework under both instructional methods. The Holistic Critical Thinking Scoring Rubric (Facione & Facione, 1994) was used to measure critical thinking--the dependent variable.
The participants of the study were 80 undergraduate students enrolled in a course designed to increase academic success and retention in college. Most were first semester freshmen, conditionally admitted to the university and required to take the course. Approximately one-half of the students analyzed the case studies individually, and one-half analyzed the cases collaboratively using an online discussion board.
Case studies related to academic self-regulation were used as prompts to stimulate higher order reasoning. The cases included real world issues faced by many undergraduate students, such as self-regulation of performance and motivation, time management, and use of deep learning strategies (Phye, 1997; Dembo, 2000).
Pretests were scored using the Holistic Critical Thinking Scoring Rubric (HCTSR). Graded pretests were returned to students for review, and large group discussion followed to identify the cognitive processes involved. Case analysis templates listing Knoop's (1984) problem-solving steps were given to students for review. The template was a paper copy of the web-based form students would be completing online in future case study homework assignments. The template included the following steps:
1. Identify the problem.
2. Determine the underlying causes and symptoms of the problem.
3. Identify any unstated assumptions you are making and determine whether they are justifiable.
4. Brainstorm and list several strategies for resolution of the case.
5. Evaluate each alternative, and then choose and rank your top 3 strategies according to effectiveness.
6. List your top 3 recommendations and present a rationale for each.
The pretest and posttest measures for critical thinking were examined for normality using skewness and kurtosis coefficients (z-tests against a critical value of 1.96) and the Shapiro-Wilk test where indicated. The Levene test for univariate homogeneity of variance across the treatment and comparison groups was conducted on the dependent variable at an alpha of .05. Pretest and posttest scores did not violate the assumption of normality; therefore, parametric tests were used to compare the means of the two groups.
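The study conducted these screening steps in SPSS. As an illustration only, an equivalent check can be sketched in Python with SciPy; the scores below are synthetic stand-ins, since the study's raw data are not available:

```python
# Sketch of the normality and homogeneity screening described above,
# using synthetic scores in place of the study's (unavailable) raw data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pretest = rng.normal(2.0, 0.5, 40)    # hypothetical HCTSR-like scores
posttest = rng.normal(2.5, 0.5, 40)

def skew_z(x):
    # z-test for skewness: coefficient divided by its standard error
    return stats.skew(x) / np.sqrt(6.0 / len(x))

def kurt_z(x):
    # z-test for (excess) kurtosis: coefficient divided by its standard error
    return stats.kurtosis(x) / np.sqrt(24.0 / len(x))

for name, x in [("pretest", pretest), ("posttest", posttest)]:
    zs, zk = skew_z(x), kurt_z(x)
    print(f"{name}: skew z = {zs:.2f}, kurtosis z = {zk:.2f}")
    if abs(zs) > 1.96 or abs(zk) > 1.96:
        # Follow up with Shapiro-Wilk where the z-tests flag non-normality
        w, p_sw = stats.shapiro(x)
        print(f"  Shapiro-Wilk W = {w:.3f}, p = {p_sw:.3f}")

# Levene test for homogeneity of variance across the two groups (alpha = .05)
lev_stat, lev_p = stats.levene(pretest, posttest)
print(f"Levene: W = {lev_stat:.2f}, p = {lev_p:.3f}")
```

The standard-error formulas for skewness and kurtosis are the common large-sample approximations; SPSS computes slightly more exact values, so results may differ marginally.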
Will the depth of critical thinking significantly improve for students analyzing case studies collaboratively using asynchronous computer-mediated communication and for students analyzing case studies individually? To test question 1, paired-samples t-tests and one-within repeated measures analyses were conducted across two measures: pretest and posttest (Maxwell & Delaney, 1990; Stevens, 1996; SPSS, 2003). Because the HCTSR is an ordinal scale, a nonparametric test, the Wilcoxon matched-pairs signed-rank test, was also conducted to check the results obtained from the parametric tests. The results of the parametric and nonparametric analyses agreed. Significant gains in critical thinking were detected within both the treatment and comparison groups. The mean difference between pretest and posttest scores for the experimental group was -.528, p < .05, with an effect size of .736 standard deviation units. The mean difference between pretest and posttest scores for the comparison group was -.574, p < .05, with an effect size of .635 standard deviation units.
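These within-group tests were run in SPSS; a rough SciPy equivalent on made-up scores might look like the sketch below. The effect-size formula shown (mean gain divided by the standard deviation of the difference scores) is one common choice and not necessarily the computation the authors used:

```python
# Illustrative within-group analysis (research question 1): paired t-test,
# Wilcoxon signed-rank, and an effect size in SD units, on synthetic scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
pretest = rng.normal(2.0, 0.6, 40)
posttest = pretest + rng.normal(0.5, 0.4, 40)  # simulate a pre-to-post gain

# Parametric: paired-samples t-test on pretest vs. posttest
t_stat, p_t = stats.ttest_rel(pretest, posttest)

# Nonparametric check, given the ordinal HCTSR scale: Wilcoxon signed-rank
w_stat, p_w = stats.wilcoxon(pretest, posttest)

# Effect size: mean gain divided by the SD of the difference scores
diff = posttest - pretest
d = diff.mean() / diff.std(ddof=1)

print(f"paired t = {t_stat:.2f}, p = {p_t:.4f}")
print(f"Wilcoxon = {w_stat:.1f}, p = {p_w:.4f}")
print(f"effect size = {d:.2f} SD units")
```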
Will the depth of critical thinking be significantly higher in students analyzing case studies collaboratively using asynchronous computer-mediated communication than in students analyzing case studies individually? To test question 2, a one-way analysis of variance was conducted on the posttest scores (Maxwell & Delaney, 1990; Stevens, 1996; SPSS, 2003). Because the HCTSR is an ordinal scale, a nonparametric test, the Mann-Whitney U test, was also conducted to check the result obtained from the parametric test. The parametric and nonparametric analyses agreed: no significant mean differences in critical thinking were detected between the treatment group (online collaborative discussion) and the comparison group (traditional individual assignment) as measured by the HCTSR.
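The between-group comparison, again run in SPSS in the study, can be sketched the same way in SciPy on synthetic posttest scores (with only two groups, the one-way ANOVA is equivalent to an independent-samples t-test):

```python
# Illustrative between-group analysis (research question 2): one-way ANOVA
# and Mann-Whitney U on posttest scores, again with synthetic data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
collaborative = rng.normal(2.6, 0.5, 40)  # hypothetical posttest scores
individual = rng.normal(2.5, 0.5, 40)

# Parametric: one-way ANOVA across the two posttest groups
f_stat, p_f = stats.f_oneway(collaborative, individual)

# Nonparametric check for the ordinal rubric scores: Mann-Whitney U
u_stat, p_u = stats.mannwhitneyu(collaborative, individual,
                                 alternative="two-sided")

print(f"ANOVA F = {f_stat:.2f}, p = {p_f:.3f}")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_u:.3f}")
```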
Participants in this study improved their critical thinking skills through online collaborative case study analysis as well as through individual case study analysis. Both groups improved their critical thinking from pretest to posttest as measured by the HCTSR. Several factors may explain this gain. First, the students in both groups were required to review and analyze case studies related to self-regulation issues many college students face. These ill-structured problems were used to motivate the students to complete the task and to initiate the analytical cognitive processes. Second, students were specifically instructed in the steps required for effective analyses (i.e., identification of the problem, relevant assumptions, resolutions, etc.) and were required to document the process by completing an online template of the steps prior to each analysis. Third, students received timely feedback on their analytical reasoning. The individually written essays required of both groups were graded and returned. Students also examined the cases and analytical processes within a class discussion. Fourth, students analyzed a total of five cases from pretest to posttest, which would appear to provide adequate practice for the tasks.
It was hypothesized that the online collaborative case study analysis method would be more effective in increasing critical thinking scores than the individual case study analysis method. Participants in the online collaborative discussions had opportunities to view the issues from multiple students' perspectives rather than from one individual perspective. It is possible that the face-to-face classroom discussions following each case equalized the groups by providing the students not using the online collaborative format the multiple perspective component that was expected to be found only in the online collaborative method. If, in fact, the in-class discussions provided the same benefits to the participants not engaged in online learning, this might explain why differences were not found between the groups. This "compensatory equalization of treatments" (Gall, Borg, & Gall, 1996, p. 472) may have obscured the effects of the experimental treatment.
Other limitations of the study necessitate discussion as well. The sample was derived from a relatively small (N = 80) and unique population of high-risk students conditionally admitted to the university. Additionally, the sample consisted of intact groups, and neither random selection nor random assignment was employed. Participants were assigned to the differential treatments by virtue of their enrollment in a particular section of the course. Random selection of participants would enhance the study's generalizability, and random assignment would increase its internal validity. In addition, it may be difficult to generalize the results to regularly admitted students or to conditionally admitted students at other institutions. All of the students in this study were full-time, and the results may not apply to part-time students or other varied populations.
Another limitation of the study is the limited variability in the instrument used to assess participants' critical thinking. Although the HCTSR is a very practical and useful tool to assess students' demonstration of various levels of critical thinking, an instrument with greater psychometric sensitivity is needed to detect change in critical thinking over time. Little variability was built into the instrument itself, since it was created on a 1-4 scale. Having a more sensitive instrument would help to detect discernible differences more effectively. In addition to the instrument's limited variability, the 4.0 scale does not coincide well with the A-F academic scale. A 5.0 scale would be more congruent with traditional scoring and would allow for more variability in scores.
The purpose of this study was to determine whether online collaborative case study learning would improve critical thinking scores more effectively than individual case study analysis. The findings of this study did not support this hypothesis: the scores of the two groups did not differ significantly at the .05 level. Each group did, however, demonstrate improvement in critical thinking scores from pretest to posttest. Replicating the study with a larger sample would strengthen its generalizability and allow the experimental design to include a third case study learning strategy--small group face-to-face case study analysis. Including this third learning condition would facilitate closer investigation of the distinctive relationships among individual learning, collaborative learning, and online learning. To further illuminate the effects of various case study learning strategies on students' critical thinking, a similar study could be conducted giving students the choice of working on the case study assignments under their preferred method: individual, small group face-to-face, or small group online discussion. A longer treatment period would also strengthen the study's reliability; conducting the study over a longer period of time may provide useful information about the temporal course of the acquisition of critical thinking in undergraduate students.
Anderson, L. W., & Krathwohl, D. R. (Eds.), with Airasian, P. W., Cruikshank, K. A., Mayer, R. E., Pintrich, P. R., Raths, J., & Wittrock, M. C. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. New York: Longman.
Arons, A. B. (1979). Some thoughts on reasoning capacities implicitly expected of college students. In J. Lochhead & J. Clement (Eds.), Cognitive process instruction (pp. 209-215). Philadelphia: Franklin Institute Press.
Bloom, B. S. (1956). Taxonomy of educational objectives: The classification of educational goals, by a committee of college and university examiners. Handbook I: Cognitive domain. New York: Longmans, Green.
Campbell, D. T., & Stanley, J.C. (1971). Experimental and quasi-experimental designs for research. Chicago: Rand McNally.
Collins, A. (1996). Design issues for learning environments. In E. Vosniadou, R. Glaser, & H. Mandl (Eds.), International perspectives on the design of technology-supported learning environments (pp. 347-361). Mahwah, NJ: LEA.
Cranston-Gingrass, A., Raines, S., Paul, J., Epanchin B., & Roselli, H. (1996). Developing and using cases in partnership environments. Teacher Education and Special Education, 19, 158-168.
de Sanchez, M. A. (1995). Using critical-thinking principles as a guide to college-level instruction. Teaching of Psychology, 22(4), 72-74.
Dembo, M. H. (2000). Motivation and learning strategies for college success: A self-management approach. Mahwah, New Jersey: Lawrence Erlbaum Associates, Inc.
Facione, P., & Facione, N. (1994). Holistic critical thinking scoring rubric. California Academic Press. Retrieved July 29, 2003, from http://www.insightassessment.com/HCTSR.html
Gagnon, G., & Collay, M. (2001). Designing for learning: Six elements in constructivist classrooms. Thousand Oaks, CA: Corwin Press, Inc.
Gall, M. D., Borg, W. R., & Gall, J.P. (1996). Educational research: An introduction. White Plains, NY: Longman Publishers USA.
Greeno, J. G., Collins, A. M., & Resnick, L. B. (1996). Cognition and learning. In D. C. Berliner & R. C. Calfee (Eds.), Handbook of educational psychology (pp. 15-46). New York: Simon & Schuster Macmillan.
Halonen, J. (1995). Demystifying critical thinking. Teaching of Psychology, 22(1), 75-81.
Halpern, D. F., & Nummedal, S.G. (1995). Closing thoughts about helping students improve how they think. Teaching of Psychology, 22(1), 82-83.
Kemery, E. R. (2000). Developing on-line collaboration. In A. Aggarwal (Ed.), Web-based learning and teaching technologies: Opportunities and challenges (p. 372). Hershey, PA: Idea Group Publishing.
Klemm, W. R. (2002). FORUM for case study learning: Analyzing research reports in a computer conferencing environment. Retrieved July 28, 2002, from http://www.cvm.tamu.edu/wklemm/CaseStudy.ms/forum_for_case_study_learning.htm
Knoop, R. (1984). Case studies in education. St. Catharines, Ontario: Praise Publishing.
Krathwohl, D. R. (2002). A revision of Bloom's taxonomy: An overview. Theory Into Practice, 41(4), 212-218.
Kreber, C. (2001). Learning experientially through case studies? A conceptual analysis. Teaching in Higher Education, 6(2), 217-228.
Maxwell, S., & Delaney, H. (1990). Designing experiments and analyzing data. Pacific Grove, CA: Brooks/Cole.
North Central Regional Educational Laboratory. (2003). Twenty-first century skills. Retrieved February 7, 2003, from http://www.ncrel.org/engauge/skills/indepth.htm
North Central Regional Educational Laboratory. (2004). Twenty-first century skills: Executive summary. Retrieved February 1, 2004, from http://www.ncrel.org/engauge/skills/exec.htm
Pellegrino, J. W. (1995). Technology in support of critical thinking. Teaching of Psychology, 22(1), 11-12.
Phye, G. (1997). Handbook of academic learning: Construction of knowledge. San Diego: Academic Press, Inc.
Pithers, R., & Soden, R. (1999). Assessing vocational tutors' thinking skills. Journal of Vocational Education and Training, 51, 23-37.
Pithers, R. (2000). Critical thinking in education: A review. Educational Research, 42(3), 237-249.
Schmuck, R. A. (2001). Foreword. In G. Gagnon & M. Collay, Designing for learning: Six elements in constructivist classrooms. Thousand Oaks, CA: Corwin Press.
SPSS. (2003). SPSS advanced statistics (Version 12.0). [Computer software]. Chicago: Author.
Stevens, J. (1996). Applied multivariate statistics for the social sciences. Mahwah, NJ: Erlbaum.
Whimbey, A., & Lochhead, J. (1986). Problem solving and comprehension. Hillsdale, NJ: Lawrence Erlbaum Associates.
Kathryn S. Lee, Texas State University--San Marcos
Dr. Lee is an Assistant Professor in Curriculum & Instruction in the College of Education.
Author: Lee, Kathryn S.
Publication: Academic Exchange Quarterly
Date: Dec 22, 2005