
Evaluating the impacts of Destination ImagiNation on the creative problem solving skills of middle school students.

Questions surrounding the effectiveness of creativity enhancement programs for the students who participate in them have long been of interest to researchers and educators, including those who advocate incorporating such programs into programs for gifted students (Beghetto, 2008; Treffinger & Isaksen, 2005). Such interest is especially strong in the era of accountability, where the demand for evidence-based practices in education dominates the research agenda (Hunsaker, 2005; Scott, Leritz, & Mumford, 2004; Treffinger, Selby, & Crumel, in press), and it has increased further as the development of so-called 21st Century skills, that is, the skills associated with critical thinking, problem solving, collaboration, and creative innovation, has received considerable attention (Charyton & Merrill, 2009; Kaufman & Beghetto, 2010; Scott et al., 2004).

One creativity enhancement program purporting to develop 21st Century skills is Destination ImagiNation (DI), which is described below. While DI and other creativity programs such as Odyssey of the Mind and the Future Problem Solving Program International assert that participation develops creative and critical thinking skills, few studies examine the impact of creativity enhancement programs using multiple outcome measures. Instead, most research addressing creativity training programs focuses on a single outcome, typically divergent thinking (Hunsaker, 2005; Meador, Fishkin, & Hoover, 1999). The research described here arises out of an evaluation study of the DI program and its impacts on several outcomes, including creative problem solving, creative and critical thinking, and teamwork skills, assessed with multiple instruments.

REVIEW OF THE LITERATURE

Creativity Defined

Although definitions of the creativity construct abound, creativity has been defined generally as the ability to produce work that is novel, high in quality, and appropriate for a given task or problem (Sternberg, Jarvin, & Grigorenko, 2011). Creativity is often considered in terms of four interconnected components called the "four P's" of creativity (Amabile, 1996). The four components are: the person who creates; the cognitive or mental processes involved in generating ideas and solving problems; the creative press, or the environment in which creativity occurs; and the product that results from the creative process. Researchers increasingly view these components as dynamic, reflecting an interaction among the creator's aptitude, the creative processes used, and the environments in which creativity occurs, all of which combine to yield a novel and useful product (Baer & Garrett, 2010; Beghetto, 2008; Plucker, Beghetto, & Dow, 2004). This multi-faceted definition of creativity is useful because it reflects the ultimate goal of creativity enhancement efforts: to increase the ability of students to produce original and useful ideas and solutions in a given context (Beghetto, 2008; Treffinger et al., in press).

In investigating the effectiveness of creativity enhancement programs, researchers have directed considerable attention toward the cognitive processes involved in creativity. These processes include both divergent and critical thinking (Beghetto, 2008; Hunsaker, 2005; Scott et al., 2004).

The Creative Process of Divergent Thinking

The cognitive mechanisms involved in generating a wide array of original ideas or solutions, collectively termed divergent thinking, have long been shown to represent a key capacity in the creative process (Beghetto, 2008; Clapham, 2004; Lonergan, Scott, & Mumford, 2004). Divergent thinking encompasses a variety of sub-skills, including originality, fluency, flexibility, and elaboration: fluency refers to the number of creative responses produced, flexibility to the category shifts in responses, originality to the uniqueness of responses, and elaboration to the number of details provided. Divergent thinking constitutes a broad focus and outcome goal of many creativity training programs, including DI (Beghetto, 2008; Treffinger & Isaksen, 2005).

The Creative Process of Critical Thinking

Creativity researchers consider critical thinking to be another important cognitive mechanism in the creative process (Beghetto, 2008; Scott et al., 2004). Critical thinking broadly refers to the analysis and evaluation of ideas in terms of their logical and empirical foundations (Beghetto, 2008; Ennis, Millman, & Tomko, 2005; Linn & Shore, 2008). While some researchers view critical thinking as a component of convergent thinking, which results in the single best idea or most appropriate solution (Beghetto, 2008), others view it as a distinct feature of the creative process that facilitates both divergent and convergent processes (Baer, 2003; Runco, 2008). Most researchers agree, however, that divergent thinking is expansive and generates numerous ideas, whereas critical thinking is focused and is involved in identifying the single best idea or most appropriate solution (Haier & Jung, 2008; Linn & Shore, 2008). As such, it is "reasonable, reflective thinking that is focused on what to believe or do" (Ennis, 1987, p. 12).

Although researchers have made substantial progress in understanding the cognitive mechanisms involved in divergent thinking, less is known about those involved in critical thinking. Thus, many researchers advocate directing increased attention to critical thinking in creativity research, including within the context of creativity enhancement programs (Dixon, Cassady, Cross, & Williams, 2005; Linn & Shore, 2008; Lonergan et al., 2004; Runco, 2008).

Creativity Enhancement Programs

A significant body of research shows that creative thinking processes can not only be measured but also taught through creativity enhancement programs (Beghetto, 2008; Grohman, Wodniecka, & Klusak, 2006; Linn & Shore, 2008; Meador et al., 1999; Treffinger & Isaksen, 2005). Research shows that programs applying a model that uses a variety of techniques to develop both divergent and critical thinking skills produce the greatest improvements on outcome measures assessing these skills (Grohman et al., 2006; Hunsaker, 2005; Meador et al., 1999; Scott et al., 2004). Notably, while critical thinking tests have been used to measure outcomes of creativity training programs, they have been used to a lesser extent than divergent thinking tests (Hunsaker, 2005; Scott et al., 2004).

With educators and researchers increasingly emphasizing 21st Century skills, additional research on the impacts of creativity enhancement programs has been encouraged (Hunsaker, 2005; Linn & Shore, 2008; Treffinger & Isaksen, 2005; Treffinger et al., in press). Currently, most research on creativity enhancement programs focuses on a single outcome, especially divergent thinking. However, because creativity is viewed as a multidimensional and complex construct, researchers increasingly recommend studies using a variety of measures to more fully gauge program outcomes (Beghetto, 2008; Scott et al., 2004). Thus, for educators to evaluate the degree to which creativity enhancement programs build 21st Century skills, research investigating program impacts across multiple outcomes is warranted.

Destination ImagiNation Program. DI is a nonprofit international creativity enhancement program that reaches over 35,000 students in Kindergarten through university level each year. On its website, DI asserts the program will provide "the opportunity to learn the basic skills of the 21st Century" including creativity, teamwork, and problem-solving, and it articulates three principal visions for doing so (www.idodi.org). The first is to "provide positive and fun environments for participants to explore and develop" creativity; the second is to "encourage teams to work productively" and present opportunities to "unleash individual strengths and styles to work collaboratively toward a common goal;" and the third is to "provide opportunities for participants to learn and experience problem solving skills, tools and methods" (www.idodi.org).

Although DI supports several programs, its primary one is a team-based program in which two to seven members join to solve two types of "Challenges," the Team Challenge (TC) and Instant Challenge (IC). The TC is multidisciplinary and open-ended. Teams have several months to choose, create, and produce a solution to one of five TC offerings. TCs focus principally on a particular discipline such as Theater Arts, Improvisation, Construction and Innovation, or Structural or Architectural Design. However, they also incorporate opportunities to demonstrate creative problem solving skills in other disciplines through a variety of required TC components. DI teams present their TC solution at regional, state, and national tournaments where trained appraisers judge the team's creativity, problem solving, critical thinking, and team building skills (www.idodi.org).

The IC, on the other hand, is unknown to teams before the tournaments, where teams receive an IC and the materials with which to solve it in a period of five to eight minutes. Thus, ICs require "impromptu, rapid-fire critical thinking" through the use of teamwork and problem solving skills (www.idodi.org). ICs can be task-based (e.g., construct as long a bridge as possible using prescribed materials), performance-based (e.g., create a skit that features five different animals), or a combination of the two (www.idodi.org). Typically, a team that selects a Theater Arts TC can expect a more technical or structural IC. Thus, teams demonstrate multiple skills across a variety of disciplines and contexts through both Challenges (www.idodi.org).

DI has its roots in the Odyssey of the Mind program. DI was started in 1999 by several individuals formerly associated with Odyssey of the Mind following a corporate dispute involving non-profit status. While they are separate programs, DI and Odyssey of the Mind are similar in programmatic structure, and both use features of the Creative Problem Solving Method.

Creative Problem Solving Method. The DI program incorporates features of the Creative Problem Solving (CPS) method developed by Sidney Parnes and his colleagues (www.idodi.org). CPS is a problem solving strategy applied to open-ended, ill-defined problems like those offered in the DI program, and it is designed to provide a flexible and natural problem solving framework. CPS is based on the assumption that anyone can be taught to apply skills associated with creativity through deliberate training (Treffinger & Isaksen, 2005). CPS presents three broad operations (problem understanding, idea generation, and action planning) that call for both critical and divergent processes (Scott et al., 2004). Using CPS, DI participants must understand the Challenge posed, brainstorm multiple ideas and solutions (divergent thinking), evaluate and analyze the chance of success for the solutions developed (critical thinking), and finally select the most effective and appropriate solution to the Challenge at hand (Treffinger & Isaksen, 2005).

PURPOSE OF THE STUDY

The degree to which DI accomplishes its goals of developing 21st Century skills is the focus of this research. In connection with an independent program evaluation, the researchers used multiple measures of the creativity and teamwork skills intended to be developed through participation in the DI program. Multiple measures were included pursuant to recommendations in the literature to investigate a variety of program outcomes. First, researchers collected survey data on the degree to which primary stakeholders in the DI program felt satisfied that the program accomplished its stated goals. Primary stakeholders included middle school students participating in DI, parents of students participating in DI, Team Managers for DI teams, and Regional and State Directors. Second, data from multiple creativity instruments were collected to compare creative and critical thinking, problem solving, and teamwork skills in students who had participated in DI to matched students who had not participated in DI.

The research questions addressed in the study were: 1) To what degree do the primary stakeholders in DI report that participation in DI improves skills associated with creative problem solving, critical thinking, and teamwork? 2) How do skills in the areas of creativity, problem solving, critical thinking, and teamwork in the middle school students who participate in the DI program compare to those skills in students who do not participate in DI?

METHODS

Instruments

Data were derived from multiple instruments: surveys of primary stakeholders in DI; the Torrance Tests of Creative Thinking, Verbal (TTCT; Torrance, 1990); the Cornell Critical Thinking Test, Level X (CCTT; Ennis et al., 2005); and a creative problem solving task called Monkey in Motion (Jarvis, 2009). The sampling strategies for each instrument are described separately.

Surveys. Questionnaires distributed through SurveyMonkey were used to measure primary stakeholders' beliefs about the impacts of DI on the students who participated in the program. Questionnaires probed multiple program areas (DI participant and team background, Creativity, Problem Solving, Teamwork and Leadership, Fun, Adult Interaction, Tournaments, Areas of Program Strength, Areas for Program Improvement). A variety of quantitative and qualitative question formats were used. For purposes of this paper, only those survey questions addressing program impacts in the areas of creativity, problem solving, critical thinking, and teamwork are discussed. A 5-point Likert scale ranging from "Not at all - 1" to "A great deal - 5" was used to measure stakeholders' beliefs about program impacts in these areas.

To reflect the multidimensional characteristics of the creative processes and skills measured, the questions probed stakeholder beliefs across a variety of relevant construct components. Gubbins' (1986) Matrix of Thinking Skills, which integrates cognitive skills associated with divergent thinking, critical thinking, and problem solving relevant to DI, informed the survey development. For example, a question related to creativity was, "On a scale from 'Not at all' to 'A great deal', how much did DI teach you (or your child, or your team members) about CREATIVE THINKING in the areas stated below?" These areas included, among others, "generating many ideas," "generating unusual or original ideas," "making unusual or original products," and "brainstorming ideas." Similarly, the question related to problem solving was, "On a scale from 'Not at all' to 'A great deal', how much did DI teach you (or your child, or your team members) about PROBLEM SOLVING in the areas stated below?" These areas included, among others, "Evaluating problems," "Analyzing many parts of a problem," and "Choosing the best alternatives in solving problems." Finally, the question related to teamwork was, "On a scale from 'Not at all' to 'A great deal', how much did DI teach you (or your child, or your team members) about TEAMWORK in the areas stated below?" These areas included, among others, "Collaborating with others," "Helping others to highlight their unique skills and talents," and "Resolving conflicts with others in a positive way." Four team managers, two state directors, and eight middle school DI participants volunteered to participate in a pilot test of the surveys to ensure that the surveys would be understandable to respondents, and minor revisions were made based on the comments from those taking the pilot surveys. Each stakeholder survey was designed to take approximately 20 minutes to complete.

TTCT, Verbal. Researchers have developed standardized tests to quantify the creative process, primarily through the use of divergent thinking tests (Plucker & Renzulli, 1999). The most widely used and researched divergent thinking tests are the TTCT (Torrance, 1990). The TTCT, Verbal contains 7 open-ended, timed writing activities in which test takers are asked to use their imagination and think creatively with words in response to visual and verbal prompts. The TTCT, Verbal produces three subscale scores: Fluency, Flexibility, and Originality (Torrance, 2008). The mean of the subscale scores is the overall creativity score. The TTCT, Verbal is reported to be a valid and reliable predictor of creative abilities, with reliability estimates ranging from 0.89 to 0.91 and strong criterion-related validity evidence (Clapham, 2004; Plucker, 1999). The TTCT, Verbal frequently has been used in research investigating the impacts of creativity enhancement programs (Clapham, 2004; Furnham, Crump, Batey, & Chamorro-Premuzic, 2009; Kim, 2008; Papworth et al., 2008; Plucker & Renzulli, 1999). Content and construct validity have been established by the TTCT developer, and intra-rater reliability coefficients are above the .90 level (Torrance, 1990). Longitudinal studies, including a 40-year follow-up, demonstrate the TTCT's long-term validity for predicting creative achievement (Cramond, Matthews-Morgan, Bandalos, & Zuo, 2005; Plucker, 1999).
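As a rough arithmetic illustration only (the study's scoring files are not available, so the numbers below simply reuse the DI group means later reported in Table 2), the Battery Average can be checked as the mean of the three subscale scores:

    # Illustration only: the TTCT, Verbal Battery Average described above is the
    # mean of the three subscale scores; here we reuse the DI group means from Table 2.
    fluency, flexibility, originality = 110.06, 107.79, 118.81
    battery_average = (fluency + flexibility + originality) / 3
    print(round(battery_average, 2))  # 112.22, close to the reported 112.19 (rounding)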

CCTT, Level X. Several researchers have developed assessments to measure critical thinking abilities (Linn & Shore, 2008; Watson & Glaser, 1980). One critical thinking test used in primary and secondary settings is the CCTT (Ennis, Millman, & Tomko, 2005). Adopting Ennis's definition of critical thinking, the CCTT measures a composite of skills associated with critical thinking, including induction, deduction, evaluation, assessment of the credibility of statements made by others, and assumption identification (Ennis et al.). The CCTT, Level X is a 71-item multiple-choice test. Each item has three response choices, one of which is correct. The number of correct answers produces a raw score, and raw scores were used in the analysis. The test allows 50 minutes for completion and can be administered in a group or individual setting. The CCTT is reported to be a valid and reliable measure of critical thinking (Ennis et al.; Daud & Husin, 2004; Gunn, Grigg, & Pomahac, 2006), with Spearman-Brown reliability coefficients ranging from 0.67 to 0.90. Criterion- and content-related validity correlations with other critical thinking measures and intelligence measures cluster around .5 (Ennis et al.).

Monkey in Motion Creative Problem Solving Task. Monkey in Motion (MiM) is a creative problem solving task (Jarvis, 2009) that was adapted for this study as a team-based creative problem solving assessment. MiM requires students in teams of two to seven members to use a set (or subset) of provided materials to develop a creative and unique solution for a small plastic monkey to be self-propelled along the length of an eight-foot string. Examples of the materials provided include balloons, magnets, springs, clay, pipe cleaners, safety pins, and straws. Multiple solutions to the problem exist. Teams had 10 minutes to develop a creative solution to the problem posed, an opportunity to describe their solution, and a five-minute period to attempt their solution and propel the monkey down the string. Directions for administration, the list of materials, and scoring can be found in Jarvis (2009). The researchers selected and adapted this task based on its similarity to Instant Challenges used in the DI program and its incorporation of a team-based creative problem solving assessment into the evaluation.

Two doctoral students in Educational Psychology at the University of Virginia, both of whom had taken doctoral-level coursework in creativity, rated observed or videotaped team solutions to the MiM task in five skill categories: creativity (Cr), problem solving (PS), critical thinking (CT), teamwork (Tw), and distance (Di) the monkey self-propelled along the string. Scoring for MiM was blind in that the DI status of the participants was unknown to the raters. Each category was rated on a 5-point Likert scale, ranging from (1) very poor application of (creativity, problem solving, critical thinking, teamwork) skills to (5) very strong application of those skills. For the distance category, the 5-point scale ranged from (1) 0%-20% along the line to (5) 81%-100% along the line, with each point reflecting a 20% increment in the distance the monkey self-propelled. The five category scores were also summed to produce a Total Score with a possible maximum of 25 points.
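A minimal sketch of how a team's MiM Total Score could be tallied from the category ratings appears below. The article does not state how the two raters' ratings were combined, so averaging them is an assumption made purely for illustration; the helper function, category labels, and ratings are likewise hypothetical.

    # Sketch (with the assumptions noted above): average the two raters' 1-5
    # ratings in each of the five MiM categories, then sum for a Total Score out of 25.
    CATEGORIES = ("creativity", "problem_solving", "critical_thinking", "teamwork", "distance")

    def mim_total_score(rater_a, rater_b):
        return sum((rater_a[c] + rater_b[c]) / 2 for c in CATEGORIES)

    # Hypothetical ratings for one team:
    rater_a = {"creativity": 4, "problem_solving": 4, "critical_thinking": 5,
               "teamwork": 4, "distance": 3}
    rater_b = {"creativity": 3, "problem_solving": 4, "critical_thinking": 4,
               "teamwork": 4, "distance": 3}
    print(mim_total_score(rater_a, rater_b))  # 19.0 of a possible 25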

Participant Samples

Survey Participants. Researchers used a combination of full population, purposeful, and snowball sampling strategies to survey DI stakeholders. Specifically, researchers e-mailed surveys to all Team Managers from the 2009-2010 DI season using a database maintained by DI (full population sampling). As DI did not maintain a database of DI participants or of parents, Team Managers also received an e-mail requesting that they forward e-mail addresses of the parents of their team members to the researchers for purposes of surveying parents (snowball sampling). Using the e-mail addresses forwarded by Team Managers, researchers e-mailed surveys to DI parents and simultaneously asked these parents to forward a survey to their middle school child(ren) who participated in DI during the 2008-2009 or 2009-2010 school year. Researchers targeted middle school students as they are the group with the highest representation in DI. Five hundred eighty-eight Team Managers, 824 parents, and 347 middle and high school students ultimately participated in the survey. All survey participants were informed that their participation was voluntary and would have no impact on future involvement in DI tournaments and activities, and that they would remain anonymous to the researchers and to DI. Although DI participants who were middle school students during the 2008-2009 and 2009-2010 school years were targeted, additional demographic information (gender, race, socioeconomic status, etc.) was not obtained. Surveys were collected in the spring of 2010.

Divergent Thinking and Critical Thinking Test Participants. TTCT and CCTT participants (both DI and non-DI) included middle school students (grades 6-8) born between 1995 and 1999 from California, Illinois, Texas, and Virginia. Although the original study design included a stratified random sampling strategy to ensure a broad, representative sample of randomly selected DI participants, parents, and team managers, it became evident that the lack of a centralized system from which to draw study participants precluded the implementation of this sampling strategy. This methodological limitation is discussed in the Limitations section. Consequently, participants in these states were selected principally because their state DI Directors were employed by a public school system and received permission to test students in their schools. Specifically, participants from California included students enrolled in an advanced language arts program; participants from Texas included homogeneously grouped middle school students in social studies classes; participants from Illinois included homogeneously grouped middle school students in a technology/library sciences class; and participants from Virginia included homogeneously grouped middle school students. This sampling strategy was intended to allow the researchers to obtain a participant sample estimated to include comparable DI and non-DI participants in terms of academic ability, age, race, gender, and socioeconomic background. However, additional demographic information, such as gender, race, and socioeconomic status, that could verify comparability was not obtained for the sample, a limitation discussed below.

Tests were administered to participants from California, Texas, and Illinois by their language arts, social studies, and media teachers respectively during class time. In Virginia, one of the researchers administered tests to participants in small groups (five to eight) at their public school after school hours. As an incentive for participation, students from both the DI and non-DI groups were entered into a lottery for $50 Best Buy Gift Cards. Ten students who were administered the TTCT and 10 students who were administered the CCTT were randomly chosen as recipients of the gift cards.

Two hundred fifty-one students (DI N = 113; non-DI N = 138) completed the TTCT, Verbal. Two hundred nineteen middle school students (DI N = 102; non-DI N = 117) completed the CCTT, Level X. Tests were administered during the fall of 2010.

MiM. MiM participants included 105 students (DI N = 59; non-DI N = 46). The 105 students made up 23 teams of two to seven members. Fifteen teams were from Virginia (8 DI teams; 8 non-DI teams), of which approximately half attended a summer enrichment program for gifted students operated by the University of Virginia. Others were recruited through e-mail solicitations from the Virginia DI Director seeking voluntary participation in the DI evaluation. One team consisting of five DI participants was recruited by the Illinois DI Director. Seven teams (4 DI teams and 3 non-DI teams) consisting of 45 homogeneously grouped students were recruited at the middle school where the Texas DI Director was employed. MiM was administered in the summer and fall of 2010 at a place and time of the team members' choosing, after school and on weekends.

RESULTS

DI Stakeholder Surveys

Survey results show broad program satisfaction across all primary stakeholders. As summarized in Table 1, majorities of each group of primary stakeholders report that the DI program teaches students a "great deal" about the skills associated with creativity, problem solving, critical thinking, and teamwork. These stakeholders report that Team Challenges and Instant Challenges at tournaments are most helpful to the development of these skills. In the open-ended qualitative component of the surveys, words frequently used to describe the DI program include "creative," "fun," "challenging," "rewarding," "fulfilling," "exhausting," and "awesome."

TTCT, CCTT, and MiM

The two groups (DI and non-DI) were compared on each of the dependent variables via independent-samples t tests. As summarized in Table 2, DI participants outperformed non-DI participants on all measures used in the program evaluation, including the TTCT, the CCTT, and MiM, except for the TTCT Fluency and Originality scores, for which the differences were not statistically significant.
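For readers unfamiliar with the analysis, the sketch below shows how such a two-group comparison might be run in Python with hypothetical scores (the study's raw data are not published), assuming equal variances, as implied by the pooled degrees of freedom reported in Table 2.

    # Hedged sketch: an independent-samples t test on hypothetical CCTT-style raw
    # scores for a DI group and a non-DI group (not the study's actual data).
    from scipy import stats

    di_scores = [48, 52, 41, 55, 47, 50, 44, 53]
    non_di_scores = [42, 39, 45, 40, 44, 38, 43, 41]

    t_stat, p_value = stats.ttest_ind(di_scores, non_di_scores, equal_var=True)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")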

TTCT, Verbal. As reflected in Table 2, a comparison of means on the TTCT, Verbal indicated statistically significant (p < .05) higher mean scores for students who participated in the DI program than non-DI students on the Battery Average (DI M = 112.19; Non-DI M = 106.6) and Flexibility scores (DI M = 107.79; Non-DI M = 101.28). Although students who participated in the DI program had higher mean scores than non-DI students on Fluency (DI M = 110.06; Non-DI M = 104.7) and Originality (DI M = 118.81; Non-DI M = 113.89), these differences were not statistically significant.

CCTT, Level X. As reflected in Table 2, a comparison of means on the CCTT indicated statistically significant (p < .05) higher mean scores for DI participants than non-DI participants (DI M = 47.02; Non-DI M = 42.85).

MiM. As reflected in Table 2, a comparison of means on the MiM creative problem solving task showed statistically significant (p < .05) higher mean scores for DI participants as compared to non-DI participants on total score (DI M = 18.76; Non-DI M = 15.02), creativity (DI M = 3.64; Non-DI M = 3.04), problem solving (DI M = 3.80; Non-DI M = 2.74), critical thinking (DI M = 4.08; Non-DI M = 3.04), distance (DI M = 3.32; Non-DI M = 2.54), and teamwork (DI M = 3.86; Non-DI M = 3.30).

DISCUSSION AND LIMITATIONS

The purpose of this study was to examine DI program impacts and stakeholder satisfaction in areas relating to creative problem solving, creative and critical thinking, and teamwork. By using multiple instruments measuring outcomes across several areas DI seeks to develop (divergent thinking, critical thinking, problem solving, teamwork, participant satisfaction), this study aimed to deepen the literature base by moving beyond the assessment of a single component of creativity in the evaluation of a creativity enhancement program.

Before moving to a broader discussion of the results and directions for future research, several notable limitations in study design should be acknowledged. First, this study did not use an experimental or quasi-experimental design; participants were neither randomly selected nor randomly assigned to DI or non-DI groups. Indeed, locating participants for all components of the study proved difficult in the early stages, with most teams and DI participants who were contacted opting out of the study. This difficulty was addressed by recruiting participants in school districts where DI Directors were able to obtain permission from their school administrators for multiple students to participate during school hours. It also has implications for future research on creativity enhancement programs, in which randomly selected and larger samples would support stronger conclusions. A second limitation arising from the sampling strategy is the possibility that students who participate in DI and programs like it are more skilled in problem solving, divergent thinking, teamwork, and critical thinking to begin with, and are therefore attracted to such programs. Although the researchers attempted to include experimental and control participants with similar academic abilities and backgrounds within each site to address this concern, the possibility that these students differed at the outset remains a limitation. A further limitation in study design is that no pre-test/post-test comparison was made that could establish comparability of skills before participation in DI and thereby demonstrate an improvement in creative and problem solving skills following participation. Although prior research has shown such pre-test/post-test improvements in the skills measured in this study (Hunsaker, 2005; Fishkin et al., 1999), pre-test/post-test comparisons and additional experimental controls would have offered added confidence in conclusions drawn about the impacts of DI program participation. Given these limitations, it is unclear whether these findings would be replicated in other settings with other populations.

Notwithstanding the limitations addressed above, some general conclusions merit discussion. Again, this study sought to evaluate a creativity enhancement program across a variety of program outcomes. As such, this study represents an initial effort to study one such program, DI, in a way that recognizes the multidimensionality of the creativity construct.

As to specific findings, nearly all primary stakeholders in the DI program reported great satisfaction with program outcomes across all areas of interest. Moreover, the majority of stakeholders strongly believe that the DI program contributes to the development of skills associated with creativity, problem solving, and critical thinking in the students who participate in DI activities and tournaments. These findings are consistent with other research showing high "participant satisfaction" with creativity enhancement programs (Frasier, Winstead, & Lee, 1997; Treffinger, Selby, & Crumel, in press), and they offer additional support for the proposition that participants in DI, as well as parents of participants and team managers of participants, perceive that the program contributes to the development of 21st Century skills.

Results from the TTCT, CCTT, and MiM creative problem solving task add some evidence supporting these stakeholder beliefs. Across the majority of study findings, students who participated in the activities and tournaments provided by DI outperformed comparable non-DI students on assessments measuring creative thinking, critical thinking, creative problem solving, and teamwork. The only exceptions were the Fluency and Originality subscales of the TTCT, on which the scores of DI participants were higher but the differences were not statistically significant. Because these group differences appeared across a variety of measured outcomes, they buttress claims made by DI that participation in its program is not only enjoyable but also develops skills associated with divergent thinking, critical thinking, problem solving, and teamwork. Because the results offer some evidence for the effectiveness of creativity enhancement programs across a variety of outcomes, they may also generate support in school districts for funding such programs. However, while potentially encouraging to advocates of creativity enhancement programs for students, the conclusion that these differences are attributable to participation in the DI program must, in light of the limitations noted above, be approached cautiously.

It is also possible that alternative explanations account for some of these results. For example, although more complex, MiM had structural features similar to some technical Instant Challenges. Thus, experience with similar challenges may have provided "processing heuristics" and relevant strategies that advantaged DI participants over non-DI participants on this measure (Scott et al., 2004, p. 377). On the other hand, not all, or even most, Instant Challenges integrate features similar to MiM, and MiM did not have any of the theatrical, improvisational, or fine arts features typically found in Instant Challenges. Similarly, some may suggest that participation in DI and programs like it provides training or heuristics that prepare participants to take the TTCT or CCTT. However, because DI does not provide systematic or focused training, but rather requires students to be self-directed throughout the development of their Challenge solutions, it seems unlikely that students acquired processing heuristics that advantaged them over non-DI participants on the TTCT or CCTT.

RECOMMENDATIONS FOR FURTHER RESEARCH

In support of continued research and evaluation of creativity enhancement program impacts in areas relating to creative problem solving, critical thinking, and teamwork, several areas for additional research are recommended. Further research should endeavor to use experimental or quasi-experimental designs with students assigned to DI or non-DI groups. Recognizing that such assignment might be difficult given the nature of the DI program, future research should, at a minimum, employ pre-test and post-test comparison groups across a variety of assessments. If statistically significant improvements are seen after participation in creativity enhancement programs in the areas described above as compared to students who did not participate, researchers could make a stronger argument that those differences are attributable to program participation.

A wider variety of assessments, including additional measures of the creative process as well as measures of other known correlates of creativity such as personality, intelligence, and motivation, is also encouraged. This would help educators and researchers better understand whether students who participate in DI and programs like it differ from those who do not on a variety of factors. Future researchers should also obtain data from program participants indicating the number of years of participation in DI or similar programs, as well as the length of time that has passed since participation ended. From these data, researchers could determine whether increased length of time in the program further enhances performance on outcome measures, and could also examine possible transfer of skills long after program participation (Hunsaker, 2005).

Finally, it is recommended that additional research be undertaken on the relationship between divergent thinking and critical thinking test scores, as there continues to be debate about the nature of the relationship between the two sets of abilities (Cropley, 2006; Grohman, Wodniecka, & Klusak, 2006). Some researchers have suggested that divergent and critical thinking are independent, even "polar opposites," such that a move toward one results in a move away from the other (Nickerson, 1999), while others conclude that they are essentially the same (Lubart, 2000-2001). Still others suggest that divergent and critical thinking are distinct but interdependent capacities for creative production, in a way that parallels the relationship between creativity and convergent thinking as measured by tests of intelligence (Runco, 2003; Scott et al., 2004; Sternberg et al., 2011). That relationship, referred to as the threshold theory, holds that intelligence and high levels of creativity are correlated up to an IQ of about 120, above which the correlation is weak (Sternberg et al.). This empirical question is clearly relevant in the context of creativity enhancement programs that teach both critical and divergent thinking. Thus, examining whether a similar threshold effect exists in the relationship between divergent and critical thinking processes may also help to clarify the cumulative evidence in the field related to specific features of the creative process, namely divergent, convergent, and critical thinking.

Tracy C. Missett, Carolyn M. Callahan, and Holly Hertberg-Davis, University of Virginia, USA

Correspondence concerning this article should be addressed to Tracy Missett, National Research Center on the Gifted and Talented, University of Virginia, 1912 Thomson Road, Charlottesville, VA, 22904, USA. E-mail: tcm2cx@virginia.edu

Author Note. Tracy C. Missett, Department of Curriculum, Instruction, and Special Education, University of Virginia; Carolyn M. Callahan, Department of Curriculum, Instruction, and Special Education, University of Virginia; Holly Hertberg-Davis, Department of Curriculum, Instruction, and Special Education, University of Virginia.

REFERENCES

Baer, J. (2003). Evaluative thinking. In M. A. Runco (Ed.), Critical creative processes. Cresskill, NJ: Hampton Press.

Baer, J., & Garrett, T. (2010). Teaching for creativity in an era of content standards and accountability. In R. A. Beghetto & J. C. Kaufman (Eds.), Nurturing creativity in the classroom (pp. 6-23). New York: Cambridge University Press.

Beghetto, R. A. (2008). Creativity enhancement. In J. A. Plucker & C. M. Callahan (Eds.), Critical issues and practices in gifted education: What the research says (pp. 139-153). Waco, TX: Prufrock Press.

Charyton, C., & Merrill, J. A. (2009). Assessing general creativity and creative engineering design in first year engineering students. Journal of Engineering Education, 145-156.

Clapham, M. M. (2004). The convergent validity of the Torrance Tests of Creative Thinking and creativity interest inventories. Educational and Psychological Measurement, 64, 828-841.

Cropley, A. (2006). In praise of convergent thinking. Creativity Research Journal, 18, 391-404.

Daud, N. M., & Husin, Z. (2004). Developing critical thinking skills in computer-aided extended reading classes. British Journal of Educational Technology, 35, 477-487.

Destination ImagiNation. (2012). Retrieved from www.idodi.org

Dixon, F., Cassady, J., Cross, T., & Williams, D. (2005). Effects of technology on critical thinking and essay writing among gifted adolescents. Journal of Secondary Gifted Education, 16, 180-189.

Dollinger, S. J., Urban, K. K., & James, T.A. (2004). Creativity and openness: Further validation of two creative product measures. Creativity Research Journal, 16, 35-47.

Ennis, R. H. (1987). A taxonomy of critical thinking dispositions and abilities. In J. Baron & R. J. Sternberg (Eds.), Teaching thinking skills: Theory and practice (pp. 9-26). New York: Freeman.

Frasier, M. M., Winstead, S., & Lee, J. (1997). Is the Future Problem Solving program accomplishing its goals? Journal of Secondary Gifted Education, 8, 157-163.

Furnham, A., Crump, J., Batey, M., & Chamorro-Premuzic, T. (2009). Personality and ability predictors of the "Consequences" test of divergent thinking in a large non-student sample. Personality and Individual Differences, 46, 536-540.

Grohman, M., Wodniecka, Z., & Klusak, M. (2006). Divergent thinking and evaluation skills: Do they always go together? Journal of Creative Behavior, 40, 125-145.

Gubbins, E. J. (1986). Gubbins's Matrix of Thinking Skills. In R. J. Sternberg (Ed.), Critical thinking: Its nature, measurement, and improvement. Washington, DC: National Institute of Education.

Haier, R. J., & Jung, R. E. (2008). Brain imaging studies of intelligence and creativity: What is the picture for education? Roeper Review, 30, 171-180.

Hunsaker, S. L. (2005). Outcomes of creativity training programs. Gifted Child Quarterly, 49, 292-299.

Jarvis, J. (2009). The relationship between adolescents' domain knowledge and creative performance on an ill-defined physics task (Doctoral dissertation). Retrieved from ProQuest Dissertations and Theses. (3351204).

Kaufman, J. C., & Beghetto, R. A. (2010). Creativity in the classroom coda: Twenty key points and other insights. In J. C. Kaufman & R. A. Beghetto (Eds.), Nurturing creativity in the classroom (pp. 415-418). New York: Cambridge University Press.

Kim, K. H. (2008). Meta-analyses of the relationship of creative achievement to both IQ and divergent thinking test scores. Journal of Creative Behavior, 42, 106-130.

Linn, B., & Shore, B. M. (2008). Critical thinking. In J. A. Plucker & C. M. Callahan (Eds.), Critical issues and practices in gifted education: What the research says (pp. 155-165). Waco, TX: Prufrock Press.

Lonergan, D. C., Scott, G. M., & Mumford, M. D. (2004). Evaluative aspects of creative thought: Effects of appraisal and revision standards. Creativity Research Journal, 16, 231-246.

Meador, K. S., Fishkin, A. S., & Hoover, M. (1999). Research-based strategies and programs to facilitate creativity. In A. S. Fishkin, B. Cramond, & P. OlszewskiKubilius (Eds.), Investigating creativity in youth: Research and methods (pp. 389-415). Cresskill, NJ: Hampton.

Nickerson, R. S. (1999). Enhancing creativity. In R. J. Sternberg (Ed.), Handbook of Creativity (pp. 392-431). Cambridge: Cambridge University Press.

Papworth, M. A., Jordan, G., Backhouse, C., Evans, N., Kent-Lemon, N., Morris, J., & Winchester, K. J. G. (2008). Artists' vulnerability to psychopathology: Towards an integrated cognitive perspective. Journal of Creative Behavior, 42, 149-163.

Plucker, J. A., & Renzulli, J. S. (1999). Psychometric approaches to the study of human creativity. In R. J. Sternberg (Ed.), Handbook of Creativity (pp. 35-61). New York: Cambridge University Press.

Runco, M. A. (2008). Commentary: Divergent thinking is not synonymous with creativity. Psychology of Aesthetics, Creativity, and the Arts, 2, 93-96.

Saxon, J. A., Treffinger, D. J., Young, G. C., & Wittig, C. V. (2003). Camp invention: A creative, inquiry-based summer enrichment program for elementary students. Journal of Creative Behavior, 37, 64-74.

Scott, G., Leritz, L. E., & Mumford, M. D. (2004). The effectiveness of creativity training: A quantitative review. Creativity Research Journal, 16, 361-388.

Sternberg, R. J., Jarvin, L., & Grigorenko, E. L. (2011). Explorations in giftedness. New York: Cambridge University Press.

Sternberg, R. J., & Lubart, T. I. (1999). The concept of creativity: Prospects and paradigms. In R. J. Sternberg (Ed.), Handbook of Creativity (pp. 3-15). New York: Cambridge University Press.

Torrance, E. P. (1999). Torrance Tests of Creative Thinking: Norms and technical manual. Bensenville, IL: Scholastic Testing Service.

Torrance, E. P. (2008). Torrance Tests of Creative Thinking, Verbal Form A. Bensenville, IL: Scholastic Testing Service.

Treffinger, D. J., & Isaksen, S. G. (2005). Creative problem solving: The history, development, and implications for gifted education and talent development. Gifted Child Quarterly, 49, 342-352.

Treffinger, D. J., Selby, E. C., & Crumel, J. H. (in press). Evaluation of the Future Problem Solving Program International. International Journal of Creativity and Problem Solving.

Vincent, A. S., Decker, B. P., & Mumford, M. D. (2002). Divergent thinking, intelligence, and expertise: A test of alternative models. Creativity Research Journal, 14, 163-178.

Watson, G., & Glaser, E. M. (1980). Watson-Glaser Critical Thinking Appraisal. New York: Psychological Corp.

Table 1
DI Stakeholders' Average Survey Responses on a Scale of 1 (Not at all)
to 5 (A great deal)

                                         DI         DI        Team
Survey Question                       Student     Parent    Manager

How much did DI teach about
CREATIVE THINKING in the
areas stated below?
  Generating many ideas                 4.54       4.50       4.47

  Generating unusual or                 4.64       4.49       4.46
  original ideas

  Making unusual or original            4.50       4.37       4.25
  products

  Finding new ways                      4.60       4.39       4.32
  to use materials

  Brainstorming                         4.48       4.55       4.51

  Thinking creatively                   4.56       4.45       4.43
  even when conditions
  become difficult or stressful

How much did DI teach about PROBLEM
SOLVING in the areas stated below?

  Having to figure out                  4.40       4.36       4.38
  what the problem is

  Comparing and contrasting             4.42       4.37       4.26
  possible solutions to
  a problem

  Making decisions about how            n/a        4.42       4.43
  to best solve a problem

  Recognizing potential obstacles       4.47       4.27       4.20
  or roadblocks to
  solving problems

  Paying attention to many details      4.49       4.28       4.22

  Focusing your (your child's/          4.41       4.31       4.11
  your team's) thinking

  Examining alternative solutions       4.61       4.29       4.21
  in terms of resources, materials,
  costs and time limits

  Performing quickly and                4.63       4.53       4.52
  under pressure

  Solving problems even when            4.59       4.45       4.45
  conditions become difficult

How much did DI teach about
TEAMWORK in the areas
stated below?
  Collaborating with others             4.73       4.76       4.71

  Helping others reach                  4.44       4.36       4.21
  their goals

  Letting others help you
  reach your goals                      4.30       4.32       4.15

  Cooperating with others               4.69       4.67       4.62

  Showing appreciation for              4.63       4.50       4.50
  the skills and talents
  of others

  Helping others highlight their        4.42       4.35       4.33
  unique talents and strengths

  Resolving conflicts with others
  in a positive way                     4.49       4.50       4.31

Table 2
DI versus Non-DI Group Mean Comparisons

                 Mean          Mean Score
               Score DI          non-DI
Measure       Group (SD)       Group (SD)       t     df      p

TTCT, BA    112.19 (20.97)   106.60 (19.72)   2.169   249   0.031 *
TTCT, Fle   107.79 (20.08)   101.28 (18.78)   2.65    249   0.009 **
TTCT, Flu   110.06 (23.57)   104.70 (23.34)   1.84    249   0.066
TTCT, Or    118.81 (21.68)   113.89 (19.82)   1.87    249   0.062
CCTT         47.02 (8.39)     42.85 (7.54)    3.92    217   0.000 **
MiM, TS      18.76 (4.57)     15.02 (6.52)    3.46    103   0.001 **
MiM, Cr       3.64 (1.24)      3.04 (1.41)    2.31    103   0.023 *
MiM, CT       4.08 (0.77)      3.04 (1.56)    4.47    103   0.000 **
MiM, PS       3.80 (0.91)      2.74 (1.20)    5.15    103   0.000 **
MiM, Tw       3.86 (1.27)      3.30 (1.36)    2.17    103   0.032 *
MiM, Di       3.32 (1.15)      2.54 (1.59)    2.91    103   0.004 **

Note. TTCT = Torrance Tests of Creative Thinking, Verbal (DI N = 113;
non-DI N = 138); CCTT = Cornell Critical Thinking Test, Level X
(DI N = 102; non-DI N = 117); MiM = Monkey in Motion creative problem
solving task (DI N = 59; non-DI N = 46).

* p < .05, ** p < .01