Researching and analysing young children's thinking competence through informal conversations and cognitive process mapping.

INTRODUCTION

The importance of and the need for teaching thinking skills have been well articulated in research and literature (Barak & Dori, 2009; Fisher, 2003; Hipkins, 2006; McGuinness, 1999; Murdoch & Wilson, 2008; Ong & Borich, 2006). However, assessment of thinking skills in the classroom has been a much neglected area (Ennis, 1993; Ku, 2009; Stiggins, Griswold, & Wikelund, 1989). This is due in part to the fact that students' subject knowledge is relatively easier to assess than their thinking skills (Bissell & Lemons, 2006).

Testing is the most established and efficient method for assessing thinking skills (Asp, 2001; Ku, 2009; ten Dam & Volman, 2004) but can pose problems such as a lack of comprehensiveness, the artificial time constraint imposed by the duration of the test, the risk of familiarity if pre-tests and post-tests are used, and failure to take into consideration factors such as differences in beliefs, culture, and assumptions (Ennis, 1993). Furthermore, written forms of assessment may not be suitable for young children because they require a certain level of literacy skills. Performance assessments can offer useful insights into the thinking processes adopted by learners because they are process-oriented (Costa & Kallick, 2001) and allow learners to demonstrate their knowledge and abilities through performance of a task (Khattri, Reeve, & Kane, 1998; Moon & Callahan, 2001). Performance assessments using observations and interviews are considered by some to be more suitable assessment techniques for young children (Gullo, 2006; McAfee, Leong, & Bodrova, 2004). However, performance assessments, especially those that promote complex problem-solving and thinking, can be messy, involve uncertainty and complexity, and present difficulties for observation, interpretation, and assessment (Schmidt & Plue, 2000). A common approach to interpretation in performance assessment involves the use of a set of criteria or rubrics constructed from a framework or taxonomy (Borich, 2006; Costa & Kallick, 2001; Marzano & Kendall, 2007). Because rubrics are constructed before learning takes place, they pose the risk of channelling assessors to look for predetermined features of task performance and to ignore aspects not foreseen or included in the rubrics (Hipkins, 2009).

Cognitive task analysis (CTA) offers the possibility of an alternative or additional approach to assessing cognitive operations and thinking skills, particularly in open-ended tasks such as problem-solving and decision-making. CTA has been developed and employed since the 1970s as a means to study and explain the complex thinking and task performances of experts across a variety of domains--medicine, aircraft piloting, air traffic control, electronic troubleshooting, industrial process control, and military command, among many others (Hoffman & Militello, 2009). According to Siegler and Alibali (2004), task analysis can lead to insights about children's thinking and problem-solving: in particular, what they are doing when they solve problems efficiently, where they might have difficulty, and what the source of that difficulty might be when they cannot solve problems efficiently. The purpose of this paper is to demonstrate how CTA might be used to support research and teaching practice aimed at understanding and developing children's thinking competence. More specifically, this paper discusses how CTA can be used to analyse and assess the cognitive processes(1), domain knowledge(2), and thinking skills employed by young children in open-ended and ill-structured tasks. The discussion in this paper is drawn from the results of a qualitative study that explored 7- to 8-year-olds' thinking competence as they engaged in open-ended and ill-structured tasks in a classroom setting.

In sum, current methods for assessing thinking skills comprise predominantly testing and criteria-based performance assessments, both of which tend to focus on predetermined aspects of task performance. CTA offers an alternative approach that overcomes this issue because it does not rely on the use of predetermined criteria or checklists. However, CTA has not previously been employed to analyse and assess young children's thinking competence and task performances in the classroom. This paper is a direct response to this challenge.

LITERATURE REVIEW

What is CTA?

The concept of CTA first emerged in the late 1970s in response to a need: How can we analyze the cognitive components of work? How can we help learners think and perform more like experts? Hoffman and Militello (2009) define CTA as "the determination of the cognitive skills, strategies and the knowledge required to perform tasks" (p.55). CTA is aimed at helping researchers explore cognitive processes in depth and apply this knowledge to the design of tools, technologies, work systems, and training (Hoffman & Militello, 2009). CTA techniques entail the use of qualitative and quantitative methods to seek out information about knowledge, thought processes, decision-making, and problem-solving and information processing strategies underlying observable task performance in real-world contexts (Hoffman & Militello, 2009; Wei & Salvendy, 2004). The outcome of CTA can be a description of the cognitive skills involved in task performance (Wei & Salvendy, 2004).

CTA has evolved and expanded in terms of scope and techniques since the 1970s through the work of instructional designers, expertise researchers, systems engineers, cognitive psychologists, and others. Despite its many variations, CTA may be described as involving three main components of knowledge elicitation, knowledge analysis, and knowledge representation (Crandall, Klein, & Hoffman, 2006; Lee & Reigeluth, 2003). CTA uses various forms of knowledge representation such as narratives, data or graphic organizers, flow charts, and process diagrams to display data, present findings, and communicate meaning (Crandall, et al., 2006). A review of CTA literature reveals that the term "knowledge" was used variously to include domain knowledge, procedures, problem-solving strategies, skills, thought processes, decisions, guidelines, and mental models (Clark, Feldon, van Merrienboer, Yates, & Early, 2007; Cooke, 1994; Lee & Reigeluth, 2003; Rowe & Cooke, 1995; Wei & Salvendy, 2004).

CTA Applications

CTA is commonly used to analyse real-world problem-solving and task performance (Cooke, 1994). CTA has been used to analyse and identify knowledge, skills or mental models in expert performance in order to inform instructional design (Clark, et al., 2007; Jonassen & Hernandez-Serrano, 2002; Lajoie, 2003), as well as to assess novice-to-expert progression in training (Carley & Palmquist, 1992; Lajoie, 2003). According to Hoffman and Militello (2009), the driving hypothesis of CTA studies is that effective task performance is highly organised and highly coherent, and CTA of expert performance often shows that experts work in a systematic fashion. The assessment of the development of expertise in CTA is facilitated by comparing a learner's performance to that of experts (Carley & Palmquist, 1992; Chipman, Schraagen, & Shalin, 2000; Lajoie, 2003) or to theoretical models (Mumford, Friedrich, Caughron, & Antes, 2009). Such comparisons focus on selected key processes the mastery of which is necessary for competent performance (Crandall, et al., 2006; Mumford, et al., 2009).

CTA Techniques

Diverse techniques in CTA tap different knowledge (Lee & Reigeluth, 2003; Wei & Salvendy, 2004) and a combination of methods is generally recommended because a single technique is unlikely to be adequate (Cooke, 1994; Crandall, et al., 2006). Observations of task performance are used to discover how experts make judgments or decisions, perform diagnosis, and carry out complex tasks (Cooke, 1994; Wei & Salvendy, 2004). Interviews and questionnaires are used to engage experts in conversations aimed at revealing their thinking and processes undertaken in making judgments, solving a problem, or deciding a course of action (Dehoney, 1995; Hoffman & Militello, 2009; Wei & Salvendy, 2004). The use of probe questions or interruption analysis involves interrupting the expert during observation of task performance in order to clarify what is observed or to solicit insights that cannot be obtained through silent observation (Cooke, 1994; Crandall, et al., 2006). Process tracing typically involves asking the expert either to self-report or to think aloud while performing the task and protocol analysis is carried out with the transcripts in order to draw inferences on what the expert sees and the thinking processes undertaken (Hoffman & Militello, 2009; Wei & Salvendy, 2004). Observations, interviews, process tracing, and protocol analysis are well-suited for analysis of tasks that are skill-based (Hoffman & Militello, 2009). In particular, observations and interviews are useful where specific task performance is not well-defined (Hoffman & Militello, 2009). Observations may also be supplemented with probe questions or retrospective review of video-tapes in order to increase information yield (Chipman, et al., 2000). Observation involving video-recording and subsequent coding of data is a technique well-suited for process-tracing in CTA (Chipman, et al., 2000).

METHODS

The techniques adopted in this study were observations of the children, interviews in the form of informal conversations, and video-recording, since the purpose was to elicit information on the cognitive processes, domain knowledge, and thinking skills employed by the children in their task performance. The analysis in this research was guided in part by CTA practices such as "knowledge" (data) representation and assessment of the development of expertise by comparing learners' performance to theoretical models. Data representation to facilitate analysis in the study involved the use of narratives and the development of a tool called a cognitive process map.

The Participants and Research Setting

The research site for the study was a state-run primary classroom in New Zealand comprising 28 children aged seven to eight years and their teacher. The school draws students from an upper-middle-class urban community in New Zealand. The participants comprised 18 girls and 10 boys, all of European origin except for one Chinese, one Malay, two Indian, and one Maori child. The school was chosen as a research site for several reasons. It was exploring the key competencies set out in the New Zealand school curriculum, which included thinking, using language, managing self, relating to others, and participating and contributing (Ministry of Education, 2007). The teacher had a keen interest in promoting children's thinking skills and had previously carried out research in this area as part of her Master's degree. The teacher's approach to developing thinking skills in the classroom included inquiry-based activities typically lasting two to three hours in the morning(3). Students worked either individually or cooperatively in pairs on topics assigned by the teacher. Problem-solving was a key feature of these projects: for example, the healthy lunch menu project discussed in this study entailed students identifying required resources, analysing information, creating different types of healthy lunch menus, and documenting their work. Children were expected to apply their visual, oral communication, reading, writing, and mathematical skills. The tasks that the children faced were open-ended and ill-structured, and the children had to employ their own approaches and strategies. The children's projects did not follow any given structure, phase-by-phase process, or well-defined pathway, even though a certain degree of support was provided by the teacher. The teacher's support included supplying materials and resources, providing access to the Internet, responding to children's questions, and suggesting areas to explore.

The case examples discussed in this paper were drawn from 25 observed cases of children's task performances (16 individual cases and 9 cases of children working in pairs) in the context of their inquiry-based activities. The following sections discuss the procedures for data collection, representation and analysis, which find their parallels in CTA in the form of knowledge elicitation, representation and analysis.

Data Collection

Data were gathered primarily through observations of the children that incorporated informal conversations, conducted either while the children were engaged in their activities or about work they had just completed, in the natural classroom setting. These informal conversations focused on eliciting the children's thinking. In researching with children, Gollop (2000) suggests that it is more helpful to adopt the approach of having a conversation when interviewing children. According to Smith, Duncan, and Marshall (2005), informal conversations, interviews, and discussions with children are useful methods for eliciting children's voice and seeking their own perspectives about their learning. Some researchers recommend interviewing children informally while they are actively engaged in activities and familiar routines (Cooney, 1993; Einarsdottir, Dockett, & Perry, 2009; Parkinson, 2001). Follow-up individual interviews were conducted to verify the observational findings or to obtain information that could not be gathered through observations.

The questions posed to the children were either semi-structured or of an unstructured, conversational type in which questions arose from the situation. The semi-structured format employed open-ended questions such as "How do you know that ... ?", "Why do you think that ... ?", "Can you tell me about or can you explain ... ?", "Can you tell me more about that?", "Why, how, or when do you do that?", and "Why do you think that happened?" A flexible structure was adopted to allow certain points of interest to be clarified or further pursued, and to allow fresh insights and new information to emerge. Respondents were free to answer questions in their own words and could answer either briefly or at length.

The researcher conducted a total of 16 visits to the school over a period of three months to collect data. The visits were made once or twice a week, with each visit lasting up to half a day. The observations and conversations, lasting between 10 and 30 minutes each, were video-taped for later transcription. Data gathering generated approximately 25 hours of video recordings.

Data Representation

Data gathered through the informal conversations in this study were represented and interpreted using narrative vignettes and cognitive process maps. A cognitive process map was constructed by using a tree-like logic structure to create an approximated framework of the complex thought processes nested in the data. In its simplest form, a cognitive process map was constructed by linking two sequential comments or observed behaviours with an aspect of thinking that had been inferred from the participant's observed behaviours and the context within which the observed activity took place. The following simple example illustrates how cognitive process mapping was applied to the child Jack's comments when he made an observation during a class activity involving a whole-class discussion on pumpkins and the making of pumpkin soup. Jack first observed that pumpkins were big and were of different colours, and then inferred that they would taste different.

The observed comments or actions were represented as paraphrases in labelled boxes. A differentiation between a child's comment or observed behaviour and that of a mediating adult was made by using a shaded box for the latter in order to highlight any adult intervention. The inferred thinking skills involved were represented as labelled lines and arrows were used to show the sequence of events or logic. Three types of structure were used: linear, divergent, and convergent.
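
As an illustrative aside that goes beyond the hand-drawn maps used in the study, a cognitive process map of this kind can be thought of as a small labelled, directed graph. The following minimal Python sketch is an assumption-laden illustration only: the class name, node texts, and edge labels are hypothetical stand-ins, and the encoding of the Jack example is a paraphrase of the description above rather than the actual instrument.

    # Minimal sketch: a cognitive process map as a labelled, directed graph.
    # Node texts, labels, and the encoded example are illustrative assumptions.

    class CognitiveProcessMap:
        def __init__(self):
            self.nodes = {}   # node_id -> {"text": paraphrase, "adult": bool}
            self.edges = []   # (from_id, to_id, inferred thinking skill)

        def add_node(self, node_id, text, adult=False):
            # "adult" marks a comment by a mediating adult (the shaded boxes).
            self.nodes[node_id] = {"text": text, "adult": adult}

        def link(self, from_id, to_id, skill):
            # The edge label records the aspect of thinking inferred from context.
            self.edges.append((from_id, to_id, skill))

        def show(self):
            for src, dst, skill in self.edges:
                who = "adult" if self.nodes[src]["adult"] else "child"
                print(f"[{who}] {self.nodes[src]['text']}"
                      f" --({skill})--> {self.nodes[dst]['text']}")

    # Encoding of the Jack example (a linear structure):
    m = CognitiveProcessMap()
    m.add_node("n1", "Pumpkins are big and of different colours")
    m.add_node("n2", "They would taste different")
    m.link("n1", "n2", "drawing an inference from observation")
    m.show()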

Cognitive process maps offered a means to summarise lengthy transcripts of data in order to focus the analysis and facilitate interpretation. The following example involves an informal conversation between the researcher and Richard in relation to his individual class project on the teacher-assigned topic of healthy lunch menu. Richard had drawn four lunch boxes on an A4-sized paper, each with drawings and written descriptions of different types of food such as "popcorn", "wedges", and "carrots" in compartments. Prices of "$9", "$11.50", "$8.60", and "$10.30" were written next to each lunch box respectively.

Researcher: What makes you think popcorn is healthy?

Richard: Well, it says on here (points to a poster on types of healthy food)--on this--and it says popcorn there.

Researcher: Oh. Okay, so you checked.

Richard: Yeah (laughs).

Researcher: And wedges? Are they healthy?

Richard: Potato is but it's probably processed--like put other flavours onto it, there will be salt--probably not.

A cognitive process map of the informal conversation (Figure 3) succinctly shows the kind of domain knowledge and thinking skills involved. It shows that Richard was able to break down the question of whether the foods were healthy in order to carry out a systematic analysis, evaluate whether his inclusion of wedges in one of his healthy lunch boxes was warranted, and arrive at the conclusion that wedges were probably not healthy. Richard had to rely on his knowledge of how wedges are made to carry out the analysis.

Data Analysis

Analysis of data in the study was undertaken using an interpretive approach (Hatch, 2002). Initial descriptive coding (Miles & Huberman, 1994) of the transcripts was undertaken to discover the types of thinking skills exercised by the children. To facilitate the process of analysis, cognitive process maps were constructed to aggregate and display data. The children's cognitive processes, thinking skills, and domain knowledge were examined. The close reading and analysis of the data led to identification of impressions or initial themes which were then recorded in memos. The memos were then studied for salient interpretations and the data were reread and coded in places where the interpretations were supported or challenged. The objective of this second level of coding was to identify patterns or themes in the data (Miles & Huberman, 1994). A draft summary of the interpretations was reviewed with the teacher so that the interpretations could be checked against the memos, reflective journals, and transcribed data. A summary of the process of analysis is provided in the Appendix.

Strategies to address the reliability of the study involved multiple repeated observations of and conversations with the children in relation to their work, individual child interviews, reflective dialogues with the teacher, and maintenance of an audit trail comprising records of field notes, transcripts, and memo-writing (Creswell, 2007; Merriam, 1998). Strategies employed to enhance the interpretive validity of the study included prolonged engagement and persistent observation in the field, a search for discrepant evidence, stakeholder checking with the teacher, memo writing to chart the steps taken in the interpretation of data, and clarification of assumptions that might be a potential cause of bias (Creswell, 2007; Thomas, 2006).

CASE EXAMPLES OF ANALYSIS USING CTA TECHNIQUES

To demonstrate how CTA can be used to analyse children's thinking competence in their task performances, the task analysis carried out in the study is presented and discussed through a number of case examples. These illustrate how CTA techniques were used to analyse, first, the children's cognitive processes and, second, their domain knowledge and thinking skills.

Analysis of Children's Performances in Open-ended, Ill-structured Tasks

Comparison of the children's task performances to relevant theoretical models of cognitive processes was carried out to analyse the cognitive processes employed by the children in the study. This included examination of the ways in which the children structured their approach, or the lack thereof.

Investigation, decision-making, and problem-solving are some of the commonly identified cognitive processes (Marzano, et al., 1988; Moseley, Elliott, Gregson, & Higgins, 2005). Various theoretical models have been proposed for these processes to explicate the steps involved or to suggest how they could be taught. The discussion in this paper focuses on the processes of decision-making and investigation. Wales, Nardi, and Stager (1986) suggest a four-stage model of decision-making: 1) state the goal; 2) generate ideas; 3) prepare a plan; and 4) take action. Ehrenberg, Ehrenberg, and Durfee (1979) propose a three-step decision-making model: 1) clarify the requirements and anticipate the ideal characteristics that would meet all of the requirements; 2) identify, clarify, and verify the characteristics of each alternative against the ideal, and select the closest alternative; and 3) verify the choice against the requirements. The adaptation and modification of these models resulted in the theoretical model for decision-making (Figure 4) that was used to facilitate analysis in this study.

Kuhn's (2007) model of inquiry--identifying a question, investigating information sources for answers, analysing and interpreting evidence, and drawing conclusions--was adapted to develop the theoretical model of investigation for this study (Figure 5).

While there are other alternatives, the theoretical models used for the discussion in this paper serve as illustrative examples of how the comparison of children's performance to a theoretical model can facilitate the analysis and assessment of children's cognitive processes. For reasons of space, two case examples are selected to illustrate the process involved.
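
To make the idea of comparison to a theoretical model more concrete, the following minimal Python sketch checks which components of a decision-making model are evidenced in a coded trace of a child's observed performance. It is a sketch only: the component names are simplified paraphrases rather than the adapted model of Figure 4, the coded trace is hypothetical, and in the study the comparison was carried out interpretively rather than computationally.

    # Sketch: checking which components of a theoretical model are evidenced
    # in a coded trace of observed task performance. Component names and the
    # example trace are simplified, hypothetical stand-ins.

    DECISION_MAKING_MODEL = [
        "state goal",
        "generate alternatives",
        "establish criteria",
        "evaluate alternatives against criteria",
        "select and verify choice",
    ]

    def compare_to_model(observed_steps, model=DECISION_MAKING_MODEL):
        observed = set(observed_steps)
        present = [c for c in model if c in observed]
        missing = [c for c in model if c not in observed]
        return present, missing

    # Hypothetical coding of the first case example (Jill), discussed below:
    jill_trace = ["generate alternatives", "establish criteria"]
    present, missing = compare_to_model(jill_trace)
    print("Evidenced components:", present)
    print("Components not evidenced:", missing)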

The first example involved Jill working on her food project. The incident occurred the day after the school had held a parents' evening(4) during which the parents had the opportunity to look at the work that the children had done for their project on food 50 years ago. The teacher asked the children to think of ways to make their presentation more interesting following a comment by one of the children that some parents did not appear to be interested in their work. The researcher observed Jill as she worked on her food project, seated at her desk with an A4-sized paper and a set of colour markers. The cognitive process map in Figure 6 summarises the informal conversation between Jill and the researcher as she worked on her food project. It shows that she constantly engaged in divergent thinking, which generated many ideas but hindered her from moving forward in the search for a way to make her food project more interesting to parents.

Comparison to the theoretical model in Figure 4 shows that Jill's performance potentially included two key components of the decision-making process. Firstly, she generated several alternatives that she could explore and evaluate for suitability, such as:

* Cook something since she thought that she was quite good at cooking.

* Make a big recipe book.

* Do something on pancakes from other parts of the world.

* Do something on cooking from countries that her parents might not have been to.

Secondly, there were two considerations that she could potentially use as criteria to evaluate her options and help her make a decision:

* "... parents feel very proud if they've learned something off their kids and they feel really proud of their kids because their kid's teaching them"

* "I don't actually think they know that in other countries that they haven't been to ... they cook different food ..."

Jill would clearly have been in a better position to achieve her goal had she adopted a more structured approach to organise her thoughts and ideas.

The second example involves Eve and Kim's internet search on their project topic of "the most popular celebration food 50 years ago". The cognitive process map in Figure 7 summarises the informal conversations conducted with the children while they were performing their search.

Analysis revealed that the process adopted by Eve and Kim had several components of the theoretical model of investigation in Figure 5. They defined the question: "What was the most popular celebration food 50 years ago?" They used the Internet as their source of information, with Google as the search engine. Their investigative strategies were relatively unsophisticated: "you can choose anything" (random choice), "pick one of the main ones" (search results), "just look up and read everything", and choosing one result to explore (browsing). However, it is unclear what they considered to be the "main ones" among the search results. They generated some information on ice-cream sandwich making machines and came to the conclusion that there were such machines 50 years ago. Eve and Kim did not evaluate their findings and conclusion against the original search question to determine their relevance. The conclusion drawn by Eve and Kim was logically weak and irrelevant to their original question. At the process level, they could have benefited from additional sources of information to triangulate their findings and from an additional step involving a more thorough analysis and interpretation of the evidence generated by their search. In terms of thinking skills, they appeared to need support in developing their ability to employ more sophisticated investigative strategies, generate a wider range of evidence, analyse their evidence, and make logical deductions to arrive at a sound conclusion.

In these case examples, the theoretical models served to guide the analysis and assessment of children's cognitive processes, and identify specific issues that impacted the children's effectiveness in decision-making and investigation.

Analysis of Children's Thinking Skills and Domain Knowledge

The CTA approach employed in the study also facilitated the analysis of the children's domain knowledge and thinking skills in their task performances. Two illustrative examples are discussed.

The first example involves Sarah and her Google search for lunchboxes from 50 years ago. Sarah showed the researcher a booklet that she and her friend had created using PowerPoint slides displaying what they had discovered in their project on lunchboxes 50 years ago. Sarah explained that they had found some of the information on the Internet. In response to the researcher's request to show how she had conducted her search, Sarah typed "lunch boxes fifty years ago" into Google search, which generated results that appeared to be irrelevant. She then clicked on Google images, and pictures of a few lunchboxes, among other things, appeared on the screen. These included pictures of lunchboxes that were clearly not from 50 years ago, such as a lunch box decorated with images of Pokemon.

Researcher: How do we know if these are lunchboxes from 50 years ago?

Sarah: Um because we googled 50 years ago, so it will come up 50 years ago. This one--this one--this one (points to three of the pictures of lunchboxes one at a time, including the one with images of Pokemon on it).

Researcher: And they don't look like the current lunchboxes? What's the difference?

Sarah: Well, the current lunchboxes are more like--they've got a whole lot of compartments and things like that.

Researcher: And these ones don't have that?

Sarah: Ah no. Just one big one like this one because this one is 50 years ago (points to the old lunch box that was brought to class by the teacher as an example of lunch box from 50 years ago).

Researcher: But we don't know. We see only the outside. Can we look inside (points to a picture of a lunch box on the computer screen)?

Sarah: Usually comes up with information there, sometimes (clicks on the picture).

Researcher: Looks like this one doesn't have it. How do we tell if it's got one compartment or lots of compartments?

Sarah: Well, you can tell because the ones with compartments have slots on them.

Researcher: But we can't see inside, can we?

Sarah: No--(silence for few seconds then points to one of the pictures on the screen)--but that one is 50 years old because that's one of the ones that we picked up on the internet--

Analysis of Sarah's internet search was facilitated by comparison with the theoretical model in Figure 5. The analysis revealed that her search process was relatively well-structured: she was able to state the question, use at least two sources of information (the internet and the old lunch box sample provided by the teacher), and analyse the evidence she generated by examining one key characteristic of old lunchboxes that differed from current ones to arrive at her conclusion. Her investigative strategy involved first typing the search words "lunchboxes 50 years ago", followed by the use of Google images and selection of those images that showed pictures of lunchboxes.

Sarah's lack of knowledge of how the Google search engine works resulted in a false conclusion. She concluded simplistically that her search results were relevant to her search objectives: "we googled 50 years ago, so it will come up 50 years ago". She further justified her conclusion by stating that the lunchboxes in her search results had "just one big one (compartment) like this one (sample of old lunchbox provided by the teacher) because this one is 50 years ago". She ignored the fact that the pictures of the lunchboxes did not reveal the interiors of the boxes, even when this was pointed out to her. She failed to realise that she had no basis for comparing the insides of those lunchboxes to her knowledge that current lunchboxes have more compartments than the old ones. Students' prior knowledge and beliefs can influence how they interpret data, including the tendency to interpret new information according to their misconceptions rather than restructure their prior conceptions (Palincsar & Brown, 1988; Willingham, 2007). According to Kim and Hannafin (2011), in situations where learners lack adequate prior knowledge, their "... naive assumptions and theories situated in prior experiences and knowledge may limit or fail to adequately inform their inquiry processes" (p. 412), and they develop oversimplified misconceptions that are highly resilient to change. Sarah was adamant that the search results were relevant, reiterating at the end of the conversation: "but that one is 50 years old because that's one of the ones that we picked up on the Internet ..."

The second example involves an informal conversation that the researcher had with two children Sue and Ellen in relation to their healthy lunch menu project. The cognitive process map in Figure 8 summarises the conversation.

Sue and Ellen were able to elaborate their criteria for what they considered to be healthy and to justify their choice of food to include in their healthy lunch menu. They were able to apply deductive thinking in concluding that "healthy food doesn't give off odour" using the premises "unhealthy food rots" and "food gives off odour when it rots". However, there were issues related to both critical thinking and domain knowledge. Their claim that one could tell whether food was healthy by merely looking at it or smelling it was tenuous. Their premises did not have direct links to the conclusion they were making in relation to healthy food. From the perspective of sound logical argument, their premises could only allow them to draw a conclusion about unhealthy food and not about healthy food (Twardy, 2004). The reasoning adopted by Sue and Ellen can also be analysed as follows. The first part comprised the pattern of modus ponens if we fill in the unstated conclusion "unhealthy food gives off odour":

Premise: Unhealthy food rots (p).

Warrant: Food gives off odour when it rots (if p then q).

Unstated or implicit conclusion: Unhealthy food gives off odour (q).

This was followed by what Bassham, Irwin, Nardone, and Wallace (2008) call denying the antecedent:

Unhealthy food gives off odour (If A then B)

Healthy food is not unhealthy (Not A).

Healthy food does not give off odour (therefore not B).

Denying the antecedent is clearly a logically unreliable pattern of reasoning (Bassham, et al., 2008). In addition, the conclusion that "healthy food doesn't give off odour" was in contradiction to the previous claim that one could tell if food was healthy by smelling it. There were also the misconceptions that "unhealthy food rots" and "healthy food does not give off odour".
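
As an aside that goes beyond the original analysis, the contrast between the two argument patterns can be verified mechanically. The short Python sketch below is illustrative only: it enumerates all truth assignments and confirms that modus ponens never yields a false conclusion from true premises, whereas denying the antecedent does (for example, when A is false and B is true).

    # Sketch: truth-table check of the two argument forms discussed above.

    from itertools import product

    def implies(a, b):
        # Material conditional: "if a then b" is false only when a is true and b is false.
        return (not a) or b

    def is_valid(premises, conclusion):
        # A form is valid if no truth assignment makes all premises true
        # while the conclusion is false.
        for a, b in product([True, False], repeat=2):
            if all(p(a, b) for p in premises) and not conclusion(a, b):
                return False
        return True

    # Modus ponens: from "if A then B" and "A", infer "B".
    print(is_valid([lambda a, b: implies(a, b), lambda a, b: a],
                   lambda a, b: b))        # True: the form is valid

    # Denying the antecedent: from "if A then B" and "not A", infer "not B".
    print(is_valid([lambda a, b: implies(a, b), lambda a, b: not a],
                   lambda a, b: not b))    # False: the form is invalid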

The two examples reflected the very different types of challenges the children faced in terms of reasoning abilities and domain knowledge. The main challenge in the first case centred on the influence of prior knowledge and beliefs on the interpretation of data, while the second involved logical reasoning concepts such as modus ponens and denying the antecedent. The examples also underscored the emergent nature of children's thinking performance in open-ended, ill-structured tasks. The observed aspects of the children's performance were not predetermined or predictable at the beginning of the observations. The CTA approach adopted in the study offered the possibility of exploring these unanticipated aspects of children's performance without necessarily being influenced by a predetermined set of criteria or checklists.

DISCUSSION

The case examples demonstrate that the CTA approach employed in this study can facilitate the understanding and analysis of the cognitive processes, domain knowledge, and thinking skills used by children to perform tasks and solve problems. Engaging children in informal conversations drew the children's attention to aspects of their thinking and encouraged them to talk about those aspects. The informal conversations generated insights into children's thinking competence by engaging them in thinking as they talked about their work and the strategies they adopted. The information elicited was represented using narrative vignettes and cognitive process maps. Cognitive process maps can be viewed as visual representations that support the assessment of the structure and effectiveness of a learner's cognitive process and the elements of thinking skills employed in a given task performance. Analysis of the narratives and cognitive process maps revealed the cognitive processes undertaken by the children in addition to the domain knowledge and thinking skills that they employed during task performance. The comparison of children's cognitive processes to theoretical models highlighted some of the challenges that the children faced and the areas where they could improve their performance.

The CTA approach adopted in the study offered the possibility of exploring aspects of children's performance, including those that were unanticipated at the beginning of the observations. In this sense, it offered an open-ended and exploratory approach to understanding and assessing children's task performances without necessarily focusing on preconceived aspects of children's competence or using any predetermined checklists or criteria. It has potential application as a means of divergent assessment for formative purposes. Pryor and Crossouard (2005) argue that both convergent and divergent forms of assessment are necessary in classroom instruction. Convergent assessment involves finding out whether learners know, understand, or can do a predetermined thing (such as apply a given thinking skill). Divergent assessment, on the other hand, involves a more open desire to know what learners know, understand, or can do. It is more exploratory, more dialogic in nature, and guided by "helping questions" rather than "testing questions" (p. 2). The CTA approach employed in the study can potentially provide formative information by helping the teacher to explore what the child is capable of and the areas in which the child may need scaffolding, rather than focusing on the child's expected response or on preconceived aspects of the child's performance.

The techniques used in this study are limited by the fact that they rely on learners' facility and ability to report the thoughts underlying their task performances, and on the researcher's efforts to ensure that accurate data are captured (Leighton & Gierl, 2007). The interpretation of performance also faces issues of potential subjectivity and multiplicity, which pose a threat to reliability (Chipman, Schraagen, & Shalin, 2000, p. 17). However, it has been argued that reliability is not an issue if the assessment is used for formative purposes to support student learning (Dochy & McDowell, 1997; Harlen, 2007). The adequacy of evidence gathered for formative purposes can be managed by careful gathering of information and by seeking better evidence should the need arise to verify or rectify a judgment (Harlen, 2007).

In conclusion, the CTA approach adopted in this study offers researchers and educators an additional option in their repertoire of tools for research and assessment purposes. The techniques employed in the study could be useful for exploring and identifying aspects of children's thinking competence for subsequent targeted assessments, such as tests or specifically designed performance tasks, in order to further evaluate children's competence and learning needs. The potential and limitations of the data collection and analysis techniques employed in this study require further exploration, including their usefulness in supporting research into children's and adults' thinking in other settings, as well as the feasibility of these techniques as formative tools in the practical setting of the classroom.

Scott Lee

Australian Catholic University, Australia

Correspondence concerning this article should be addressed to Scott Lee, Australian Catholic University, Level 3, 174 Victoria Parade, East Melbourne, VIC 3002, Australia. E-mail: wfleescott@gmail.com

REFERENCES

Asp, E. (2001). To think or not to think: Thinking as measured on state and national assessments. In A. L. Costa (Ed.), Developing minds: A resource book for teaching thinking (3rd ed., pp. 497-510). Alexandria, VA: Association for Supervision and Curriculum Development.

Barak, M., & Dori, Y. (2009). Enhancing higher order thinking skills among inservice science teachers via embedded assessment. Journal of Science Teacher Education, 20(5), 459-474.

Bassham, G., Irwin, W., Nardone, H., & Wallace, J. (2008). Critical thinking: A student's introduction (3rd ed.). Boston: McGraw-Hill Higher Education.

Bissell, A. N., & Lemons, P. P. (2006). A new method for assessing critical thinking in the classroom. BioScience, 56(1), 66-72.

Borich, D. G. (2006). Assessing thinking. In A.-C. Ong & D. G. Borich (Eds.), Teaching strategies that promote thinking: Models and curriculum approaches (pp. 284-302). Singapore: McGraw-Hill Education.

Carley, K., & Palmquist, M. (1992). Extracting, representing, and analyzing mental models. Social Forces, 70(3), 601-636.

Chipman, S. F., Schraagen, J. M., & Shalin, V. L. (2000). Introduction to cognitive task analysis. In J. M. Schraagen, S. F. Chipman, & V. L. Shalin (Eds.), Cognitive task analysis (pp. 3-23). Mahwah, NJ: Lawrence Erlbaum Associates.

Clark, R. E., Feldon, D., van Merrienboer, J., Yates, K., & Early, S. (2007). Cognitive task analysis. In J. M. Spector, M. D. Merrill, J. J. G. v. Merrienboer, & M. P. Driscoll (Eds.), Handbook of research on educational communications and technology (3rd ed., pp. 577-593). Mahwah, NJ: Lawrence Erlbaum Associates.

Cooke, N. J. (1994). Varieties of knowledge elicitation techniques. International Journal of Human-Computer Studies, 41(6), 801-849.

Cooney, M. (1993, September). Making meaning from the child's perspective: Interviewing young children. Paper presented at the Northern Rocky Mountain Educational Research Annual Conference, Jackson, WY.

Costa, A. L., & Kallick, B. (2001). Building a system for assessing thinking. In A. L. Costa (Ed.), Developing minds: A resource book for teaching thinking (pp. 517-527). Alexandria, VA: Association for Supervision and Curriculum Development.

Crandall, B., Klein, G., & Hoffman, R. R. (2006). Working minds: A practitioner's guide to cognitive task analysis. Cambridge, MA: MIT Press.

Creswell, J. W. (2007). Qualitative inquiry and research design: Choosing among five approaches (2nd. ed.). Thousand Oaks, CA: Sage Publications.

Dehoney, J. (1995). Cognitive task analysis: Implications for the theory and practice of instructional design. Paper presented at the 1995 Annual National Convention of the Association for Educational Communication and Technology (AECT), Anaheim, CA.

Dochy, F. J. R. C., & McDowell, L. (1997). Introduction: Assessment as a tool for learning. Studies in Educational Evaluation, 23(4), 279-298.

Ehrenberg, S. D., Ehrenberg, L. M., & Durfee, D. (1979). BASICS: Teaching / learning strategies. Miami Beach, FL: Institute for Curriculum and Instruction.

Einarsdottir, J., Dockett, S., & Perry, B. (2009). Making meaning: Children's perspectives expressed through drawings. Early Child Development and Care, 179(2), 217-232.

Ennis, R. H. (1993). Critical thinking assessment. Theory into Practice, 32(3), 179-186.

Fisher, R. (2003). Teaching thinking. London: Continuum.

Gollop, M. M. (2000). Interviewing children: A research perspective. In A. B. Smith, N. J. Taylor, & M. M. Gollop (Eds.), Children's voices: Research, policy and practice (pp. 1-17). North Shore City, New Zealand: Pearson Education.

Gullo, D. F. (2006). Assessment in kindergarten. In D. F. Gullo (Ed.), K today: Teaching and learning in the kindergarten year (pp. 139-147). Washington, DC: National Association for the Education of Young Children.

Harlen, W. (2007). Assessment of learning. London: Sage Publications.

Hatch, J. A. (2002). Doing qualitative research in education settings. Albany, NY: State University of New York Press.

Hipkins, R. (2006). The nature of the key competencies: A background paper. Wellington, New Zealand: New Zealand Council for Educational Research.

Hipkins, R. (2009). Determining meaning for key competencies via assessment practices. Assessment Matters, 1, 4-19.

Hoffman, R. R., & Militello, L. (Eds.). (2009). Perspectives on cognitive task analysis: Historical origins and modern communities of practice. New York: Psychology Press.

Jonassen, D. H., & Hernandez-Serrano, J. (2002). Case-based reasoning and instructional design: Using stories to support problem-solving. Educational Technology Research & Development, 50(2), 65-77.

Khattri, N., Reeve, A. L., & Kane, M. B. (1998). Principles and practices of performance assessment. Mahwah, NJ: Lawrence Erlbaum Associates.

Kim, M. C., & Hannafin, M. J. (2011). Scaffolding problem solving in technology-enhanced learning environments (TELEs): Bridging research and theory with practice. Computers & Education, 56(2), 403-417.

Ku, K. Y. L. (2009). Assessing students' critical thinking performance: Urging for measurements using multi-response format. Thinking Skills and Creativity, 4(1), 70-76.

Kuhn, D. (2007). Is direct instruction an answer to the right question? Educational Psychologist, 42(2), 109-113.

Lajoie, S. P. (2003). Transitions and trajectories for studies of expertise. Educational Researcher, 32(8), 21-25.

Lee, J.-Y., & Reigeluth, C. M. (2003). Formative research on the heuristic task analysis process. Educational Technology Research and Development, 51(4), 5-24.

Leighton, J. P., & Gierl, M. J. (2007). Verbal reports as data for cognitive diagnostic assessment. In J. P. Leighton & M. J. Gierl (Eds.), Cognitive diagnostic assessment for education (pp. 146-172). New York: Cambridge University Press.

Marzano, R. J., Brandt, R. S., Hughes, C. S., Jones, B. F., Presseisen, B. Z., Rankin, S. C., et al. (1988). Dimensions of thinking: A framework for curriculum and instruction. Alexandria, VA: Association for Supervision and Curriculum Development.

Marzano, R. J., & Kendall, J. S. (2007). The new taxonomy of educational objectives (2nd ed.). Thousand Oaks, CA: Corwin Press.

McAfee, O., Leong, D. J., & Bodrova, E. (2004). Basics of assessment: A primer for early childhood educators. Washington, DC: National Association for the Education of Young Children.

McGuinness, C. (1999). From thinking skills to thinking classrooms: A review and evaluation of approaches for developing pupils' thinking. London: Department for Education and Employment.

Merriam, S. B. (1998). Qualitative research and case study applications in education. San Francisco: Jossey-Bass.

Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook (2nd ed.). Thousand Oaks, CA: Sage Publications.

Ministry of Education. (2007). The New Zealand curriculum. Wellington, New Zealand: Learning Media.

Moon, T. R., & Callahan, C. M. (2001). Classroom performance assessment: What should it look like in a standards-based classroom? NASSP Bulletin, 85(622), 48-58.

Moseley, D., Elliott, J., Gregson, M., & Higgins, S. (2005). Thinking skills frameworks for use in education and training. British Educational Research Journal, 31(3), 367-390.

Mumford, M. D., Friedrich, T. L., Caughron, J. J., & Antes, A. L. (2009). Leadership development and assessment: Describing and rethinking the state of the art. In K. A. Ericsson (Ed.), Development of professional expertise: Toward measurement of expert performance and design of optimal learning environments (pp. 84-107). New York: Cambridge University Press.

Murdoch, K., & Wilson, J. (2008). Helping your pupils to think for themselves. Milton Park, Abingdon, UK: Routledge.

Ong, A. C., & Borich, D. G. (Eds.). (2006). Teaching strategies that promote thinking: Models and curriculum approaches. Singapore: McGraw-Hill Education.

Palincsar, A. S., & Brown, A. L. (1988). Teaching and practicing thinking skills to promote comprehension in the context of group problem solving. Remedial and Special Education, 9(1), 53-59.

Parkinson, D. D. (2001). Securing trustworthy data from an interview situation with young children: Six integrated interview strategies. Child Study Journal, 31(3), 137-155.

Pryor, J., & Crossouard, B. (2005). A sociocultural theorization of formative assessment. Paper presented at the The Sociocultural Theory in Educational Research and Practice Conference, University of Manchester.

Rowe, A. L., & Cooke, N. J. (1995). Measuring mental models: Choosing the right tools for the job. Human Resource Development Quarterly, 6(3), 243-255.

Schmidt, M., & Plue, L. (2000). The new world of performance-based assessment. Orbit, 30(4), 14-17.

Siegler, R. S., & Alibali, M. W. (2004). Children's thinking (4th ed.). Upper Saddle River, NJ: Prentice Hall.

Smith, A., Duncan, J., & Marshall, K. (2005). Children's perspectives on their learning: Exploring methods. Early Child Development & Care, 175(6), 473-487.

Stiggins, R. J., Griswold, M. M., & Wikelund, K. R. (1989). Measuring thinking skills through classroom assessment. Journal of Educational Measurement, 26(3), 233-246.

ten Dam, G., & Volman, M. (2004). Critical thinking as a citizenship competence: teaching strategies. Learning and Instruction, 14(4), 359-379.

Thomas, D. R. (2006). A general inductive approach for analyzing qualitative evaluation data. American Journal of Evaluation, 27(2), 237-246.

Twardy, C. R. (2004). Argument maps improve critical thinking. Teaching Philosophy, 27(2), 95-116.

Wales, C. E., Nardi, A. H., & Stager, R. A. (1986). Decision-making: New paradigm for education. Educational Leadership, 43, 37-41.

Wei, J., & Salvendy, G. (2004). The cognitive task analysis methods for job and task design: Review and reappraisal. Behaviour & Information Technology, 23(4), 273-299.

Willingham, D. T. (2007). Critical thinking: Why is it so hard to teach? American Educator, 31(2), 8-19.

APPENDIX: PROCESS OF ANALYSIS

(1) Cognitive processes in this study refer to "macro" processes such as problem-solving and decision-making that may involve "micro" thinking skills such as making logical deductions and drawing conclusions.

(2) The term domain knowledge in this study refers to knowledge required to function or perform tasks within a given domain.

(3) School sessions commenced at 9 am and ended at 3 pm daily, five days a week. Subject-based lessons such as mathematics, literacy, and social science were conducted in the afternoon.

(4) Parents' evenings are meetings with parents held once every school term, four times per school year. The purpose is to encourage dialogue between the teacher and the parents, as well as to showcase some of the children's work to the parents.