Inquire Biology: a textbook that answers questions.
The goal of the Inquire Biology textbook is to provide better learning experiences to students, especially those who hesitate to ask questions. (2) We wish to create an engaging learning experience so that more students can succeed--and specifically to engage students in more actively processing the large number of concepts and relationships they encounter. Inquire Biology aims to achieve this through interactive features focused on the relationships among concepts, because the process of making sense of scientific concepts is strongly related to the process of understanding relationships among concepts (National Research Council 1999). To encourage students' engagement in active reading, our pedagogical approach is to help students articulate questions about relationships among concepts and to support them in finding the answers.
Inquire Biology incorporates multiple technologies from the field of artificial intelligence. It includes a formal knowledge representation of the content of the textbook, reasoning methods for answering questions, natural language processing to understand a user's questions, and natural language generation to produce answers. It is based on a systematic knowledge-acquisition process that educators can use to represent the textbook's knowledge in a way that the computer can reason with to answer and suggest questions. A unique aspect of our approach is the use of human-computer interaction design methods to create a product that combines these advanced AI technologies into a user experience that is compelling for students.
Inquire Biology is one of the products of Project Halo, (3) a long-term research effort to develop a capability to answer questions on a wide variety of science topics. Inquire Biology is based on Campbell Biology (Reece et al. 2011), which is a widely used textbook in advanced high school and introductory college courses. Campbell Biology is published by Pearson Education, and we are using it under a research-use license. Using a popular textbook as the basis for our research ensures that the content is pedagogically sound and the results have immediate applicability to a large group of students already using the textbook. The basic concepts in the design of Inquire Biology are, however, broadly applicable to other textbooks and scientific disciplines.
Exploration of the application of AI technology to an electronic textbook comes at an opportune time, as we are witnessing a large-scale transition from paper to electronic textbooks. Textbooks made available in an electronic medium offer tremendous opportunities to leverage a variety of AI techniques much broader than what we have considered in our work so far, for example, recommending books (Pera and Ng 2012) and enhancing electronic textbooks with content from online resources such as images and videos (Agrawal et al. 2011). Our focus on knowledge representation leverages the synergy between the need for precise knowledge content in an electronic textbook and the strength of knowledge representation and reasoning in providing it.
We begin this article with a description of Inquire Biology, introducing its key features and describing how a student might use it for active reading and homework. We give an overview of the AI technology used in Inquire Biology and then present the results of a day-long pilot experiment showing that students studying from Inquire Biology received approximately 10 percent higher grades on homework and posttest problems than students studying from the paper or electronic version of the textbook. We discuss the generalizability and scalability of the approach and conclude the article with a discussion of related work and directions for future work.
Features of Inquire Biology
Inquire Biology connects three kinds of information to meet student needs: the Inquire Biology textbook, concept summary pages, and computer-generated answers to questions.
The Campbell Biology textbook contains the original content of the textbook as published by Pearson Education. The textbook content is hyperlinked to the concept summary pages. These pages summarize salient facts and relationships that a student needs to know about a concept. Summary pages combine content from the glossary entries from the back of the book with automatically generated descriptions of salient properties and relationships for each of the 5000 plus concepts in our knowledge base. These pages also include links to related concepts, relevant passages from the book, and follow-up questions useful for further exploration.
A student can ask questions of Inquire Biology in three ways. First, a question may be typed into a freeform dialogue box, from which Inquire Biology computes the closest questions that it can answer and gives the user a choice for selecting the best match among them. Second, in response to the student highlighting in the text, Inquire Biology computes the most relevant questions it can answer for the selection. Finally, Inquire Biology suggests questions for further exploration on the concept summary pages and with each page that contains an answer. We next illustrate how students interact with these features of Inquire Biology.
Figure 1 shows the textbook interface of Inquire Biology. Like most electronic textbooks, Inquire Biology supports highlighting the text, taking notes in the margin, and interacting with graphics. Inquire Biology leverages its knowledge base to expand on these standard features in the following three ways: (1) The toolbar provides a table of contents, index to concept summaries, and navigation history. Students can ask a question at any time by tapping the Q icon. (2) Within the text, biology terms are automatically linked--students can tap to see a quick pop-up definition or to navigate to the full concept summary. (3) Students can highlight with a quick and easy gesture, and each highlight serves as the anchor for a note card and a list of related questions, encouraging students to dig deeper into the material.
Concept Summary Page
For every biology term in the book, Inquire Biology presents a detailed concept summary page. Each of these pages begins with a human-authored definition of a concept (marked as 1 in figure 2). Most of these definitions are already available at the back of Campbell Biology, but some were added as part of the knowledge base (KB). The text definition is followed by key facts and relationships about that concept (marked as 2 in figure 2). These are automatically generated from the KB. Since the KB is specific to Campbell Biology, the concept summary never has the sort of unnecessary details students might find on Wikipedia or other general-purpose sites.
The sidebar of a concept summary page contains the figures from the textbook that are most relevant to that topic (marked as 3 on figure 2). Finally, questions relevant for further exploration of the topic are listed (marked as 4 in figure 2). Knowledge in a concept summary page can span multiple chapters, but the concept summaries put all the relevant facts and figures in one place. Unlike most glossaries, the Inquire Biology concept summary is not a dead end but can act as a catalyst to further learning--pop-up definitions and follow-up questions help students explore relationships between concepts and dig deeper into the material.
Students can ask questions at any time, but they may not always know what to ask. Inquire Biology is proactive, suggesting questions in a variety of contexts to help illustrate the types of questions Inquire Biology can answer.
The question-asking interface, shown in figure 3, is invoked by tapping the Q icon. The student can either enter a free-form question directly or enter a list of biology terms. Inquire Biology suggests related questions as the student types, helping students formulate their questions and offering alternatives for cases where Inquire Biology cannot understand the free-form question directly. These suggested questions fall into three main categories: definitions and reference information, comparisons between two concepts, and questions that explore relationships, such as how structure affects function. These three types provide the best overlap between the system's capabilities and the types of questions students find useful.
In response to each question, Inquire Biology returns a detailed answer with context to help students focus on what is important. Answers are designed to be concise and understandable, which means that different answers require different presentations.
Comparison questions compute the key differences between two concepts, building on the learning principle of conceptual contrast (Ausubel 1968). The result is displayed in a table, keeping the information well organized. Context is provided with short definitions and superclass information at the top of the table, followed by the most important differences. Figure 4 shows the answer to a comparison question about chloroplasts and mitochondria.
Answers to relationship questions are rendered as graphs (figure 5), as these best highlight the connections among concepts, building on the learning principle of helping students make sense of concepts by exploring relationships among them (Scardamalia and Bereiter 2006). The graph is simplified from what might exist in the underlying KB, with relations designed to be clearly understandable to a biology student. Because labels alone may not be enough, brief definitions of each term appear next to the graph, along with links to follow-up questions.
For more difficult problems, our goal is to support student investigations. In accordance with the learning principle of scaffolding (Bliss, Askew, and Macrae 1996), Inquire Biology links to related concepts and suggests follow-up questions, helping students ask the right questions and work their way toward a solution. With such scaffolding, students can make progress on more difficult investigations than they could handle unaided. In the current system, such scaffolding is always available to them. Further, by providing ready access to basic facts, Inquire Biology may reduce the cognitive load (Sweller 1988) associated with information retrieval and allow students to stay focused on higher-order learning objectives.
Concept of Use
Inquire Biology is designed to support two educational use cases: active reading and homework support. It is certainly possible to use Inquire Biology in a classroom and for exam preparation, but we did not investigate such use in our work. Here, we illustrate how a student might use its capabilities in the context of active reading and homework support.
Active reading is reading with a purpose, usually to understand a concept or to answer a question. Active reading has a strong track record in empirical learning research (Palincsar and Brown 1984). As a learning strategy, it consists of four activities that take place before, during, and after reading a text: predict, ask, connect, and explain. The predict activity encourages students to look ahead before reading a passage. For example, students may skim a section, or look at headings, figures, and captions for organizational cues. The student combines this preview with prior knowledge to form predictions of how the topic fits into the larger themes of the domain and to anticipate the content that will follow. In the ask phase, students formulate questions to reflect on what they are learning and are encouraged to create practice test questions. In the connect phase, students relate the new content to related knowledge, including personal experience, current events, prior knowledge, and concepts in other subjects. In the explain phase, students restate the read content in their own words, through notes, summarizations, or diagrams, such as tables, flow charts, outlines, or concept maps. Following is a scenario to illustrate how a student might engage in active reading while using Inquire Biology:
I'm reading section 10.2 on photosynthesis. I have already highlighted some sentences in this section, and I encounter a term that I do not understand: thylakoid membrane. I tap on this term to see more information about it and I am taken to a concept summary page. I am first shown the definition: thylakoid membrane is the membrane surrounding the thylakoid. This definition is the same as the definition in the paper version of the textbook and not particularly useful. It is followed by additional information that is not directly available in the glossary of my paper textbook: that a thylakoid membrane is a type of double membrane. Further, I can see the parts that are unique to it: the electron transport chain and ATP synthase. I can tap on electron transport chain and view detailed information about its parts such as cytochromes. From this summary page, I can also see that the functions of a thylakoid membrane are light reaction and photophosphorylation. But, what about the functions of its parts? I can ask a question: What does an electron transport chain reaction do? I get an answer: noncyclic electron flow. As I return to the textbook, I can see that it is the next section that I am about to read.
In this example, Inquire Biology has supported active reading in the following ways:
Connect: The pop-up definitions and concept summary pages help students connect to additional material and allow a quick and seamless way to refresh and review previous knowledge. In this example, the student was able to review the structure and function of thylakoid membrane, and also obtained more detailed information about some of its parts--for example, electron transport chain.
Explain: Inquire Biology's highlighting and note-taking capabilities allow students to mark passages they find particularly important and to extend and summarize the information in their own words. Inquire Biology also explains the basics of a concept--for example, in this case it points out that a thylakoid membrane is a double membrane.
Ask: Inquire Biology suggests questions whenever a student makes a highlight. Related questions appear on concept summary pages and answers, and students can tap to see Inquire Biology's answer. In addition, Inquire Biology allows the student to ask free-form questions. In this example, the student asked a question about the function of an electron transport chain, deepening the understanding of the overall concept of a thylakoid membrane.
Predict: Students can attempt to answer Inquire Biology's suggested questions on their own and test their predictions by tapping through to see the generated answers. In this example, the answer to a student's question about the function of an electron transport chain leads to a preview of what is coming up next in the textbook, appropriately setting up the student's learning goals.
Inquire Biology assists students in understanding and constructing answers for complex conceptual homework problems. Inquire Biology generally cannot answer complex homework questions directly, but provides support in three ways: (1) reducing the emphasis on memorization by providing quick access to facts and relationships needed to solve a homework problem, (2) comparing and relating concepts that may not necessarily be discussed in the same section of a textbook, and (3) deconstructing problems by suggesting simpler questions. We illustrate this capability by considering a homework problem from section 6.5 of Campbell Biology (figure 6). The scenario below suggests how a student might use Inquire Biology to understand and answer this problem.
To answer the first question, I ask Inquire Biology: What is the similarity between a chloroplast and a mitochondrion? Inquire Biology handles this question directly and gives me two nicely organized tables: one listing the similarities and the other the differences between a mitochondrion and a chloroplast. The answer to the similarity question tells me that both have a function of energy transformation. The similarity answer gives me little information about structure. So, I review the answer to the differences section (See figure 4). Under the structure section of the answer, I am told that a chloroplast has a chloroplast membrane and a mitochondrion has a mitochondrial membrane. They are named differently, but seem similar, and since the first problem asks me to consider structure, I ask the question: What is the similarity between a chloroplast membrane and a mitochondrial membrane? In the answer, I am told that they are both double membranes. I bet that this is one of the key structural features that should be an answer to the problem. To answer the second question, I can directly ask Inquire Biology factual questions such as: Is it true that plant cells have a mitochondrion? Inquire Biology tells me that the answer is: Yes. This straight fact retrieval reduces the need to memorize, but developing an explanation is more challenging. I could ask the question: What is the relationship between a mitochondrion and a plant cell? Inquire Biology presents a graph showing the relationship between the two. This helps me develop an explanation that a mitochondrion is a part of plant cells and functions during cellular respiration. The third question is open ended, and Inquire Biology helps me in deconstructing this question into simpler questions. When I highlight this question in Inquire Biology, it suggests several questions some of which are: What is an endomembrane system? What are elements/members of an endomembrane system? 
What are the differences between a mitochondrion and endoplasmic reticulum? What event results in mitochondrial membrane? What event results in endoplasmic reticulum membrane? None of these questions directly answers the third question, but they give me starting points for my exploration that might help me eventually construct an argument of the kind that this problem is asking for.

Figure 6. An Example Homework Problem. Concept Check 6.5 is from Biology (9th edition) by Neil A. Campbell and Jane B. Reece. Copyright (c) 2011 by Pearson Education, Inc. Used with permission of Pearson Education, Inc.

CONCEPT CHECK 6.5
1. Describe two common characteristics of chloroplasts and mitochondria. Consider both function and membrane structure.
2. Do plant cells have mitochondria? Explain.
3. A classmate proposes that mitochondria and chloroplasts should be classified in the endomembrane system. Argue against the proposal.
In the previous scenario, we can see that Inquire Biology does not directly answer a homework problem, but provides answers to portions of the question, which a student must then assemble to construct an overall answer. Through this process, it provides an easier access to facts and relationships, but the student is still required to develop a conceptual understanding of the material to answer a problem. By using Inquire Biology's question-answering facility, students may not necessarily be able to do their homework faster, but we do expect them to acquire a deeper understanding of material and be more engaged in doing their homework. The fact-retrieval questions as we saw in the solution of the second question help the student learn basic facts, and the comparison question that we saw in the first question relates pieces of information that may not be presented next to each other in the textbook. This reduces cognitive load and encourages a focus on conceptual understanding.
AI Technology in Inquire Biology
We now give an overview of the AI technology that enables the functioning of Inquire Biology. Detailed descriptions of various system components are available in previously published papers (Chaudhri et al. 2007, Clark et al. 2007, Gunning et al. 2010).
The overall system consists of modules for creating knowledge representation from the book, modules for performing inference and reasoning methods on the knowledge, and modules for asking questions and presenting answers to the student. Figure 7 illustrates the functional components of the system.
For the purpose of Inquire Biology, a subject-matter expert (SME) is a biologist with a bachelor's degree in biology or a related discipline. The SME uses a knowledge authoring system and the authoring guidelines to represent knowledge from the textbook contributing to a growing knowledge base. Inquire Biology uses AURA as its knowledge authoring system. Details on AURA have been published in AI Magazine (Gunning et al. 2010). The authoring guidelines specify a process by which SMEs systematically read the textbook content, arrive at a consensus on its meaning, encode that meaning, test their encoding by posing questions, and validate the knowledge for its educational utility.
AURA uses the knowledge machine (KM) as its knowledge representation and reasoning system (Clark and Porter 2012). KM is a frame-based representation language that supports a variety of representational features including a class taxonomy, a relation taxonomy, partitions (disjointness and covering axioms), rules (attached to a frame's slots), and the use of prototypes (canonical examples of a concept). The knowledge base itself uses an upper ontology called the component library or CLIB (Barker, Porter, and Clark 2001). The CLIB is a domain-independent concept library. The SMEs access the CLIB through AURA and use it to create domain-specific representations for knowledge in Campbell Biology, resulting in a biology knowledge base. The current knowledge base has been well-tested for the questions used in Inquire Biology for chapters 2-12 of Campbell Biology.
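To make the flavor of a frame-based representation concrete, the following is a minimal, hypothetical sketch of frames with a class taxonomy, locally asserted slot values, and inheritance. The names (`Frame`, `get_slot`) and the toy biology fragment are illustrative, not KM's or CLIB's actual syntax:

```python
# Hypothetical sketch of a frame-based representation in the style of KM:
# classes with superclasses, slot values, and slot inheritance.

class Frame:
    def __init__(self, name, superclasses=(), slots=None):
        self.name = name
        self.superclasses = list(superclasses)  # class taxonomy links
        self.slots = dict(slots or {})          # locally asserted slot values

KB = {}

def define(name, superclasses=(), slots=None):
    KB[name] = Frame(name, superclasses, slots)
    return KB[name]

def get_slot(name, slot):
    """Collect slot values for a frame, inheriting from superclasses."""
    frame = KB[name]
    values = list(frame.slots.get(slot, []))
    for sup in frame.superclasses:              # walk up the taxonomy
        for v in get_slot(sup, slot):
            if v not in values:
                values.append(v)
    return values

# Toy fragment of a biology taxonomy (illustrative content only)
define("Organelle", slots={"is-inside": ["Cell"]})
define("Mitochondrion", superclasses=["Organelle"],
       slots={"has-part": ["Mitochondrial-Membrane"],
              "has-function": ["Cellular-Respiration"]})

print(get_slot("Mitochondrion", "is-inside"))   # inherited from Organelle
```

The query for `is-inside` on `Mitochondrion` finds no local value and falls back to the `Organelle` superclass, which is the essence of taxonomic inheritance in a frame system.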
KM also provides the core inference services for AURA. KM performs reasoning by using inheritance, description-logic-style classification of individuals, backward chaining over rules, and heuristic unification (Clark and Porter 2012). A focus of innovation in our work has been to identify these core features and to specify them in a declarative manner (Chaudhri and Tran 2012). We export the biology KB in a variety of standard declarative languages, for example, first-order logic with equality (Fitting 1996), SILK (Grosof 2009), description logics (DLs) (Baader et al. 2007) and answer-set programming (Gelfond and Lifschitz 1990).
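Backward chaining, one of the core inference services named above, can be illustrated with a deliberately simplified propositional sketch: to prove a goal, either find it as an asserted fact or find a rule whose conclusion matches the goal and prove its premises recursively. The fact and rule encoding below is an assumption for illustration, not KM's actual rule syntax:

```python
# Illustrative backward chaining: facts are plain strings, and each rule
# maps a conclusion to the list of premises that must all be proved.

FACTS = {
    "mitochondrion has-part mitochondrial-membrane",
    "mitochondrial-membrane is-a double-membrane",
}

RULES = {
    "mitochondrion is-a double-membrane-organelle": [
        "mitochondrion has-part mitochondrial-membrane",
        "mitochondrial-membrane is-a double-membrane",
    ],
}

def prove(goal):
    """Prove goal from FACTS, chaining backward through RULES."""
    if goal in FACTS:                       # base case: asserted fact
        return True
    premises = RULES.get(goal)
    if premises is not None:                # rule whose conclusion matches
        return all(prove(p) for p in premises)
    return False

print(prove("mitochondrion is-a double-membrane-organelle"))  # True
```

A real system such as KM works over structured frames with variables and unification rather than strings, but the control flow (goal, matching rule, recursive subgoals) is the same.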
In addition, AURA incorporates several special-purpose reasoning methods for answering comparison and relationship questions (Chaudhri et al. 2013). We briefly explain the computation used in these methods. For comparison questions, we first compute the description of a typical instance of each class in terms of its types and locally asserted slot values and constraints. Next, we compute the similarities and differences between the two descriptions. The result is organized into a table. We use a variety of presentation heuristics to identify the most important similarities and differences and to present differences that correspond to each other in the same row of the table. For example, a slot whose value differs between the two classes is shown before a slot that has a value for one class but none for the other. The presentation heuristics also rely on input from biologists on how the slots should be organized; for example, slots such as has-part and has-region are grouped together because they correspond to structural information. To compute the relationship between two individuals, we first calculate all paths in the knowledge base between the two individuals. Since calculating all paths is expensive, we use heuristics to control the search: for example, we first explore only taxonomic relationships and then relationships that can be found in a single concept graph. The computed paths are ranked in order of importance. As in the case of comparison questions, importance is computed using heuristics provided by the biologists; for example, paths involving only structural slots are preferred over paths that contain arbitrary slots.
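The comparison computation can be sketched as follows, assuming each class description has already been reduced to a mapping from slots to value sets; the slot names and values below are illustrative stand-ins for descriptions derived from the KB:

```python
# Sketch of the comparison computation: split slots into similarities
# (values shared by both classes) and differences (values unique to each),
# then order the differences with a simple presentation heuristic.

def compare(desc_a, desc_b):
    """Return (similarities, ordered differences) for two descriptions."""
    similarities, differences = {}, {}
    for slot in sorted(set(desc_a) | set(desc_b)):
        a = set(desc_a.get(slot, []))
        b = set(desc_b.get(slot, []))
        if a & b:
            similarities[slot] = a & b
        if a ^ b:
            differences[slot] = (a - b, b - a)
    # Heuristic: slots where both classes have (different) values come
    # before slots where only one class has a value.
    ordered = sorted(differences.items(),
                     key=lambda kv: not (kv[1][0] and kv[1][1]))
    return similarities, ordered

chloroplast = {"has-function": ["energy-transformation", "photosynthesis"],
               "has-part": ["chloroplast-membrane", "thylakoid"]}
mitochondrion = {"has-function": ["energy-transformation",
                                  "cellular-respiration"],
                 "has-part": ["mitochondrial-membrane", "cristae"]}

sims, diffs = compare(chloroplast, mitochondrion)
print(sims)   # shared values, e.g. energy-transformation
print(diffs)  # row-aligned differences, most contrastive slots first
```

Each entry of `diffs` corresponds to one row of the rendered table, with the two classes' differing values side by side, mirroring the answer layout shown in figure 4.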
Inquire Biology works through an HTTP connection to an AURA server running on a Windows machine. The AURA server contains the biology knowledge base and supports the question-answering and question-suggestion facilities, the details of which have been previously published (Gunning et al. 2010). AURA also pre-computes all the concept summary pages, which are included in the Inquire Biology application during the build process.
The question-answering facility in AURA relies on a controlled natural language understanding system that was described by Clark et al. (2007) and Gunning et al. (2010). With direct use of controlled natural language, students often had difficulty anticipating which questions the system could or could not handle. To address this issue, AURA supports a suggested-question facility. In response to questions entered by the student, AURA computes the most closely matching questions that it can answer and presents these suggestions to the student. This computation relies on a large database of questions that are automatically generated by traversing the knowledge base. For example, for two concepts that are siblings in the class taxonomy, the system will generate a comparison question that asks for similarities and differences between the two concepts. The generated questions conform to the templates supported by the system. As the student types, even a partially entered question is matched against the precomputed database to find the most closely matching questions, which are proposed as suggestions. This approach substantially reduces the occurrence of questions that cannot be parsed by AURA and greatly enhances the usability of the question-answering facility. Our approach to question generation is similar in spirit to other recent work on question generation (4) with one important difference: our system has access to a detailed knowledge representation of the textbook that can be used for question generation, while most other question-generation systems attempt to generate questions from the surface representation of the text.
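The two steps just described, pre-generating questions from the taxonomy and matching partial student input against them, can be sketched as below. The keyword-overlap scoring and all function names are simplifying assumptions; the real system matches against question templates:

```python
# Sketch of the suggested-question facility: pre-compute a comparison
# question for every pair of sibling classes, then rank questions by
# keyword overlap with the student's (possibly partial) input.

from itertools import combinations

def generate_questions(taxonomy):
    """Pre-compute comparison questions for sibling classes."""
    questions = []
    for siblings in taxonomy.values():
        for a, b in combinations(sorted(siblings), 2):
            questions.append(
                f"What are the differences between a {a} and a {b}?")
    return questions

def tokens(text):
    return set(text.lower().replace("?", "").split())

def suggest(partial, questions, k=3):
    """Rank pre-computed questions by keyword overlap with the input."""
    words = tokens(partial)
    scored = [(len(words & tokens(q)), q) for q in questions]
    scored.sort(key=lambda s: (-s[0], s[1]))
    return [q for score, q in scored[:k] if score > 0]

taxonomy = {"Organelle": ["mitochondrion", "chloroplast", "ribosome"]}
questions = generate_questions(taxonomy)
print(suggest("chloroplast mitochondrion", questions))
```

Because matching happens against a closed set of answerable questions, every suggestion the student taps is guaranteed to parse, which is the key usability property described above.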
After the inference and question-answering components produce the basic content of an answer, the answer-generation module constructs the full, human-readable answer. Each kind of answer has a specific screen layout and uses a mixture of presentation types that include bulleted lists, graphs, and natural language statements to describe the base facts from the knowledge base. The answers leverage a natural language generation component that automatically generates English sentences from the KB (Banik et al. 2012).
The Evaluation of Inquire Biology
The goal of evaluating Inquire Biology was to assess the extent to which the AI enhancements to Campbell Biology were useful to students for the active reading and homework support tasks, and to determine whether Inquire Biology leads to better learning. The formal evaluation of Inquire Biology was preceded by a series of user studies, which we used to refine Inquire Biology's capabilities and ensure that the system was usable. In discussing the evaluation, we describe the experimental design, participant profile, procedure used, and both qualitative and quantitative results.
Our evaluation used a between-subjects research design, with students randomly assigned to one of three conditions. The full Inquire Biology group (N = 25) used the system version that had all the AI-enabled features discussed earlier in this article. The textbook group (N = 23) used a standard print copy of Campbell Biology. The ablated Inquire Biology group (N = 24) used a version of Inquire Biology that lacked the concept summaries and question-answering capabilities but included all the other ebook features such as highlighting and annotation. The number of subjects in each group was chosen to give the study sufficient statistical power.
The purpose of these three groups was to compare studying with Inquire Biology to two forms of existing practice: studying with a paper textbook and studying with a conventional ebook reader version of a textbook. We wished to control for unexpected effects that may arise from the iPad platform, or idiosyncrasies with the Inquire Biology interface. Specifically, we wanted to avoid criticisms that any improvements observed by using Inquire Biology were simply due to the excitement of using an iPad and that use of an unenhanced electronic textbook lowers student performance as compared to using a paper textbook.
In actual practice, the students might use lecture notes, Wikipedia, and other teacher-provided resources during active reading and homework problem solving. However, we did not incorporate these into our study because of the huge variation in available supplementary resources.
We recruited 72 participants from a local community college. The number of subjects was chosen to ensure that the study had sufficient power to detect expected differences. All the participating students were enrolled in an introductory biology course that used the same edition of the Campbell Biology textbook that is contained within Inquire Biology. The students were prescreened to ensure that they had not previously covered the sections of the text used in the evaluation. Based on the student background variables, such as grade point average, age, gender, educational background, native language, and familiarity with iOS devices, the three groups were comparable at the onset of the study. All students were financially compensated for their time; they did not receive any academic credit for their participation; and, there was no attrition of the participants during the exercise.
All groups performed an identical learning task: they were asked to study a unit on cellular membrane structure and function. We chose this topic as it is fundamental to cell biology, and students frequently have difficulty understanding the material. Students' specific learning objectives were to (1) understand specific components of membranes and their functions, (2) understand what specific components of membranes contribute to their selective permeability, and (3) explore how structure influences function at molecular and macromolecular levels.
The evaluation took place outside of the classroom to ensure that study conditions were the same for each group. Each session took place on a single day and lasted between four and six hours depending on the group. The evaluation consisted of four primary components: introduction and training, active reading task, homework support task, and a posttest. In addition, a researcher interviewed students in the Inquire Biology groups afterwards to better understand their experience.
The introduction and training consisted of an orientation to the study and a 30-minute training exercise designed to familiarize participants with the process of active reading and how they could use Inquire Biology for homework problem solving. The training on active reading and homework problem solving was conducted by a biology teacher.
The active reading training introduced the students to habits of active reading (predict, ask, connect, and explain), and illustrated these habits using an example. The example used for this training was gravitation, and was specifically chosen to be unrelated to the topic of the actual evaluation. After a lecture on active reading, students were given a scripted tutorial on active reading. This tutorial used a section from chapter 6 of Campbell Biology that concerned cell structure and provided step-by-step instructions on how a student could engage in active reading using the features of Inquire Biology, ablated Inquire Biology, or the print version of the text.
The training segment for problem solving focused on understanding homework problems, different question types, understanding answers, and constructing follow-up questions. The training also covered formulating questions that contain "how" and "why" in the question statement: although "How" and "Why" questions are not directly supported in Inquire Biology, many of them can be rephrased as question types that are supported. The training segment also included a five-minute video illustrating how a student can use Inquire Biology to solve a nontrivial homework problem.
Most of the active reading training was administered to all three groups. However, only the full Inquire Biology group received training on using Inquire Biology features for active reading and homework problem solving.
The problem sets used for the homework and the posttest were designed by a biology instructor. The homework problem set had six problems (with several subparts), and the posttest had five problems. Both problem sets included problems that the instructor would normally use while teaching the course and were not specifically aligned to the capabilities of Inquire Biology. The problem sets were piloted with students in the user studies preceding the evaluation to catch any obviously confusing wording, ensuring that unclear problem statements would not confound the results. Students were able to use their book while doing the homework exercises but not during the final posttest.
Students were asked to read the first two sections of chapter 7 from Campbell Biology in 60 minutes, do homework in 90 minutes, and then take a posttest in 20 minutes (table 1). The exercise was followed by an informal discussion session in which the students completed usability and technology-readiness surveys and provided qualitative feedback. The students' homework and posttest were each scored by two teachers, and score discrepancies were resolved. All student work was coded such that the teachers had no knowledge of which student or condition they were scoring.
Quantitative Results from the Posttest
Figure 8 shows the quantitative posttest results. Both the homework and posttest tasks had a maximum score of 100. The scores in figure 8 show the mean score of students in each group.
Our primary interest is in the effect on the posttest score of Inquire Biology as compared to the two contrasting, more typical conditions. The mean posttest score of 88 for the Inquire Biology group was higher than both the corresponding score of 75 for the ablated Inquire Biology group and the score of 81 for the textbook group. The difference between the Inquire Biology group and the ablated Inquire Biology group was statistically significant (p value = 0.002). The difference between the Inquire Biology group and the textbook group was also significant (p value = 0.05). These differences suggest that students who used Inquire Biology learned more.
The mean homework score of 81 for the Inquire Biology group was higher than both the corresponding score of 74 for the ablated Inquire Biology group and a score of 71 for the textbook group. The difference between the Inquire Biology group and the ablated Inquire Biology group was not statistically significant (p value = 0.12), but the difference between the Inquire Biology group and the textbook group was significant (p value = 0.02). The observed trend is consistent with our hypothesis that Inquire Biology enhances learning by helping students perform better on homework.
The comparison between the paper textbook and the ablated version of Inquire Biology is also interesting. Although the mean homework score of 74 for ablated Inquire Biology and the mean score of 71 for the paper textbook differ, the difference was not statistically significant (p value = 0.18). This result is consistent with prior research showing that simply changing the medium of presentation has no impact on student learning (Means et al. 2009).
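The pairwise comparisons above rest on standard two-sample significance tests. As an illustration only (the article does not specify which test was used, and the score lists below are hypothetical, not the study's data), a Welch two-sample t statistic, appropriate when the groups may have unequal variances, can be computed as:

```python
from math import sqrt
from statistics import mean, variance

def welch_t(a, b):
    """Welch two-sample t statistic (unequal variances assumed)."""
    m1, m2 = mean(a), mean(b)
    v1, v2 = variance(a), variance(b)  # sample variances (n - 1 denominator)
    return (m1 - m2) / sqrt(v1 / len(a) + v2 / len(b))

# Hypothetical per-student scores for two conditions -- illustrative only.
group_a = [88, 90, 85, 92, 86]
group_b = [75, 78, 72, 80, 74]
t = welch_t(group_a, group_b)  # positive t: group_a scored higher on average
```

The p values reported in the text would then come from comparing such a statistic against the t distribution with the Welch-Satterthwaite degrees of freedom.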
We gathered extensive data on how the different features of Inquire Biology were used but did not discover any significant correlation between the usage of a particular feature and the improvement on posttest performance.
We did not explicitly measure to what extent students followed the specific steps of active reading, to what extent they acquired deeper understanding, or whether they were more engaged with the material. The posttest scores are a measure of depth of understanding, and increased engagement was apparent in the qualitative feedback, which we discuss next. Additional insight could be gained by analyzing students' note-taking behavior, but since note taking was orthogonal to the core AI features that were the subject of the study, we leave that analysis for future work.
We gathered three forms of qualitative data to assess the usefulness of Inquire Biology. The data included the usage of Inquire Biology features as well as results from a usability survey and a technology readiness survey.
We tracked the questions asked by students and the concept summary pages visited by them. All students in the full Inquire Biology group asked questions and visited concept summary pages. Students viewed a total of 81 concept summary pages during the exercise; 38 of these were viewed by at least two students. Approximately 30 percent of the questions were of the form "What is X?"; another 30 percent were factual questions; and the remaining 40 percent were evenly split between comparison questions and relationship questions. This usage data is suggestive of the usefulness of these features to students (see figure 9).
During the active reading task, students cumulatively asked 120 questions, and during the homework task they asked a total of 400 questions. This difference is expected, as students naturally have a greater need for asking questions during the homework task. During active reading, the question asking predominantly came from students clicking on questions suggested in response to their highlighting of text. During homework problem solving, however, students were more prone to use Inquire Biology's question-asking dialogue. A total of 194 unique questions were asked by the students, out of which only 59 questions were asked by two or more students. The variety of questions would make it very difficult to anticipate all questions in the textbook itself. This suggests that automatic question answering can be a useful addition to a textbook.
In figure 10, we show the actual questions that the students asked and how many different students asked each question. There is a good spread of different question types, even though questions of the form "What is X" predominated.
After use, the students rated the usability of Inquire Biology using the SUS scale (Brooke 1996). The SUS score for ablated Inquire Biology was 84.79 with a standard deviation of 11.82, whereas the score for full Inquire Biology was 84.4 with a standard deviation of 9.02. This suggests a high degree of subjective satisfaction in using Inquire Biology as well as no degradation in usability from the addition of AI-based features.
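The SUS scores above are computed with the standard scoring rule from Brooke (1996): ten Likert items rated 1 to 5, where odd-numbered (positively worded) items contribute their rating minus 1, even-numbered (negatively worded) items contribute 5 minus their rating, and the sum is scaled by 2.5 to a 0-100 range. A minimal sketch of that rule:

```python
def sus_score(responses):
    """System Usability Scale score (Brooke 1996).

    responses: ten Likert ratings, 1 (strongly disagree) to 5 (strongly
    agree), in questionnaire order. Odd-numbered items contribute
    (rating - 1); even-numbered items contribute (5 - rating). The summed
    contributions are multiplied by 2.5 to yield a score from 0 to 100.
    """
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = sum(r - 1 if i % 2 == 0 else 5 - r  # i is 0-based, so even i = odd item
                for i, r in enumerate(responses))
    return total * 2.5

# A fully neutral respondent (all 3s) lands exactly at the scale midpoint:
print(sus_score([3] * 10))  # -> 50.0
```

The group means of 84-85 reported above would be averages of such per-student scores.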
The students also completed a survey to provide subjective ratings of various features of Inquire Biology. The students were asked to indicate whether a particular feature was ready for use, ready for use with some improvements, or required major improvements. The concept summary pages had the highest score with 76 percent of students rating them as ready to use in the present form, 16 percent rating them as requiring minor improvement, and 8 percent rating them as requiring major improvements. For the question-answering facility, 51 percent of the students rated it as ready to use in present form, 37 percent rated it as requiring minor improvement, and 12 percent rated it as requiring major improvements.
In the posttask debrief, students in the Inquire Biology group unanimously reported that they would like to use the system for their class. They reported that Inquire Biology "motivates you to learn more" and "helps you stay focused." Students also reported that they felt less distracted as they could use Inquire Biology to get additional information without losing context. Usage logs show students made use of question answering and concept summary pages during the active reading task, which suggests that they were engaged in the content and actively seeking out related and supporting information. A more extensive evaluation of how Inquire Biology improves engagement is open for future work.
Overall, the qualitative assessment of the Inquire Biology features was positive but also indicated some areas for improvement. Students reported that the comparison answers were "well organized" and "to the point," but they also commented that the range of questions supported needs to be expanded and the answer quality still needs to be improved.
Generalizability and Scalability
Let us now consider how the process of knowledge base construction generalizes, and how it might scale to textbooks beyond biology. Our most substantial experience in using AURA is with Campbell Biology. In addition to biology, we have prior experience in using AURA for physics and chemistry (Gunning et al. 2010). We have also performed a design-level analysis for the domains of government and politics, microeconomics, and environmental science. Within biology, we have considered AURA's applicability to representing a middle school textbook and advanced textbooks on cell biology and neuroscience. We can draw the following conclusions from these studies.
Conceptual knowledge (the ability to define classes and subclasses, state disjointness, and define slot values and rules), mathematical equations, and qualitative knowledge (for example, directly or inversely proportional relationships) cut across all domains and are widely applicable. Such knowledge can account for at least half of the knowledge that needs to be captured from a textbook. Inquire Biology demonstrates that a useful enhancement to an electronic textbook can be built using these core forms of knowledge. The approach is directly applicable, with few changes, to any textbook comparable in scope to Campbell, such as Mason, Losos, and Singer (2011) or a middle school textbook (Miller and Levine 2010). The conceptual knowledge, qualitative knowledge, and mathematical equation features of AURA generalize to any of the domains mentioned above.
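To make the "core forms" named above concrete, the following sketch (our illustration only, not AURA's or CLIB's actual representation; all class and slot names are invented) shows a class hierarchy with slots, a disjointness constraint, and a qualitative proportionality relation:

```python
# Illustrative sketch -- not AURA's internal representation -- of conceptual
# knowledge (classes, subclasses, disjointness, slots) and qualitative
# knowledge (directly/inversely proportional relations).

class Concept:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.slots = {}          # slot name -> list of values

    def isa(self, other):
        """True if self is other or a (transitive) subclass of it."""
        node = self
        while node is not None:
            if node is other:
                return True
            node = node.parent
        return False

entity = Concept("Entity")
membrane = Concept("Membrane", parent=entity)
protein = Concept("Protein", parent=entity)
plasma_membrane = Concept("PlasmaMembrane", parent=membrane)
plasma_membrane.slots["has-part"] = ["Phospholipid", "TransportProtein"]

# Disjointness: nothing is both a Membrane and a Protein.
disjoint = {frozenset({"Membrane", "Protein"})}

# Qualitative knowledge: membrane fluidity rises with temperature.
qualitative = {("membrane-fluidity", "temperature"): "directly-proportional"}
```

A reasoner over such structures can answer subclass and part-of queries by walking the hierarchy, which is the kind of computation the conceptual-knowledge features above rely on.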
Outside the context of an AI research project, one possible way to achieve scalability in practice is for the knowledge representation of the book's content to become an integral part of the book-authoring process. This new requirement is no different from preparing the table of contents, index, or glossary typically found in standard textbooks. The books of the future could include knowledge representations for the most salient portions of their content, enabling numerous pedagogical features, some of which are illustrated in Inquire Biology.
AURA does not cover all forms of knowledge found across multiple domains. Some domains are more mathematical than others, for example, physics, algebra, or economics. For such domains, the reasoning capability of the system needs to include mathematical problem solving. Textbooks in subjects such as chemistry and algebra also require capturing procedural knowledge. A typical example in chemistry is the sequence of steps for computing the pH of a buffer solution. While solutions exist for mathematical problem solving and for capturing procedural knowledge, their combination with conceptual knowledge requires novel research. The background knowledge provided in CLIB needs to be extended for each textbook, requiring some research in conceptual modeling and the application of that research by domain experts in the respective domain. New textbook subjects also require developing new question types and answer-presentation methods.
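As a concrete instance of the buffer-pH example mentioned above, the final arithmetic step of such a procedure is the standard Henderson-Hasselbalch equation, pH = pKa + log10([conjugate base]/[weak acid]). The sketch below is our illustration of that formula, not AURA's encoding of the procedure:

```python
from math import log10

def buffer_ph(pKa, conc_base, conc_acid):
    """Henderson-Hasselbalch: pH = pKa + log10([conjugate base]/[weak acid])."""
    return pKa + log10(conc_base / conc_acid)

# Acetate buffer (pKa of acetic acid is about 4.76) with equal acid and
# conjugate-base concentrations: the log term vanishes, so pH = pKa.
print(buffer_ph(4.76, 0.10, 0.10))  # -> 4.76
```

Capturing the full procedure (identify the acid/base pair, look up the pKa, account for added strong acid or base, then apply the equation) is what distinguishes procedural knowledge from a single formula.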
The work presented here directly overlaps with three related areas of research: electronic textbooks, question-answering systems, and reading for comprehension. In this section, we briefly situate our work in these three areas of related research.
To inform the initial design of Inquire Biology, in early 2010 we evaluated existing ebook apps and electronic textbooks to develop a list of key features and solutions to common problems. Based on this analysis, we determined that students would expect Inquire Biology to save their places in the book and make it easy to take notes, highlight text, zoom in on images, and search the concept summaries. We also saw many approaches to displaying the highly structured content of textbooks, from scaled-down images of actual pages (Kno) (5) to entirely custom iPad-optimized layouts (Apple iBooks textbooks). We chose to render the book content as a vertically scrolling HTML page, as that gave us an optimal combination of readable text, flexible layouts, good performance, and a reasonable level of effort.
Standing apart from the other iPad textbook apps is Inkling, (6) an app released in August 2010, around the time we built the first alpha versions of Inquire Biology. Inkling 1.0 had all the major features discussed above and, like Inquire Biology, rendered content as speedy HTML rather than slower scans of book pages. Since 2010, Inkling has added several features that may address some of the specific needs we target with Inquire Biology: definitions of key terms, provided by integrating Wikipedia into the app's search feature, and shared notes that allow students to communicate with their classmates and ask each other questions about the material. Because AURA's answers never stray from the content of Campbell Biology, they are more accurate and better scoped than what one might get from Wikipedia or from one's peers. Inkling's work is, however, an important step forward for students and an indication of the rapid improvement occurring in the electronic textbook domain.
A survey of recent question-answering systems has been published in AI Magazine (Gunning, Chaudhri, and Welty 2010). For the present discussion, we will compare AURA to Watson and Wolfram Alpha.
Watson is a recent question-answering system from IBM that outperformed humans in the game of Jeopardy (Ferrucci 2012). Watson is an impressive system that showed strong performance over a broad domain of question answering. One of the primary characteristics of the Jeopardy problem is that 94.7 percent of the answers to questions are titles of Wikipedia pages (Chu-Carroll and Fan 2011). The remaining 5.3 percent of answers include multiple entities, such as "red, white, and blue," or even short phrases, such as "make a scarecrow." Since there are more than 3 million Wikipedia pages and a huge variety of questions, this is not an easy task, yet it is constrained in a way that makes it amenable to solution by machine-learning techniques. In contrast, the answers returned by AURA can span from a paragraph to a page--especially the answers to comparison and relationship reasoning questions. Furthermore, these answers are not always stated in the textbook, and even when they are, they may be spread across different parts of the textbook. In addition, Inquire Biology is an interactive user-centered application and has much higher usability demands.
Wolfram Alpha is a question-answering system based on Mathematica, a computational reasoning engine. (7) Wolfram Alpha relies on well-defined computations over curated data sets. For example, when given a question such as "orange and banana nutrition?" it looks up the nutritional information for both orange and banana, adds them, and presents the net result. When given an equation such as g(n + 1) = n^2 + g(n), Wolfram Alpha gives its solution and presents a plot showing the value of the solution as a function of n. Wolfram Alpha is similar in spirit to AURA in the sense that it provides well-defined computations over curated data, but very different in terms of the curated data and the specific computations. For AURA, the knowledge is curated from a biology textbook, and the computations are based on logical reasoning rather than mathematics.
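The recurrence in the Wolfram Alpha example above is easy to check by hand. Assuming a base case of g(1) = 0 (the original leaves it unspecified), unrolling g(n + 1) = n^2 + g(n) gives g(n) = g(1) + the sum of k^2 for k from 1 to n - 1, which has the closed form (n - 1)n(2n - 1)/6:

```python
def g(n, g1=0):
    """Evaluate g(n) iteratively from g(n + 1) = n**2 + g(n), with g(1) = g1."""
    value = g1
    for k in range(1, n):
        value += k ** 2
    return value

def g_closed(n, g1=0):
    """Closed form of the same recurrence: g1 + (n - 1) * n * (2n - 1) / 6."""
    return g1 + (n - 1) * n * (2 * n - 1) // 6

print(g(4))  # -> 0 + 1 + 4 + 9 = 14
```

This kind of symbolic solution of a recurrence is exactly the mathematical computation Wolfram Alpha performs, and it contrasts with AURA's logic-based reasoning.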
Achieving the potential offered by ebook technology will depend on a rich understanding of how students learn from college-level science textbooks. Past research indicates that as the information complexity of such textbooks increases, the explanatory clarity, or text cohesion, declines (Graesser et al. 2004). To succeed, science students must move from decoding to sense-making, which involves making connections between prior knowledge and new information (Kintsch 1988; Palinscar and Brown 1984; Scardamalia et al. 1996), yet research indicates that many college students find it difficult to read science textbooks because of gaps in their understanding of science concepts (diSessa 1993; Driver and Easley 1978; Ozuru, Dempsey, and McNamara 2009).
Inquire Biology features promote active reading strategies that can help readers construct better models of the concepts they are learning. Priming readers to focus on their goals or the text structure before reading can serve as a form of advance organizer that cues existing knowledge, supports strategic reading, and permits readers to encode new knowledge more effectively (Ausubel and Youssef 1963; Mayer 1979). Providing readers with opportunities to check their knowledge as they read supports active self-regulation of learning (Bransford, Brown, and Cocking 1999; Brint et al. 2010). Prompting readers to pose questions as they read is also useful for metacognitive monitoring, improving comprehension, and making connections among ideas (Novak 1998; Pressley and Afflerbach 1995; Udeani and Okafor 2012). In addition, coupling visual elements in textbooks, such as diagrams, with text can also have quite robust effects on student learning (Mayer 2003), especially for students who prefer visual information (Mayer and Massa 2003).
Inquire Biology is an early prototype of an intelligent textbook. Our results show the promise of applying AI technology to electronic textbooks to enhance learning. To fully realize this promise, more work needs to be done in at least the following categories: improving the representation, improving question-answering capability, conducting more extensive education studies, and improving the learning gains. We discuss each of these directions in more detail.
Improving Representation and Question Answering
As of early 2013, Inquire Biology provides very good coverage of chapters 2-12 of Campbell Biology, and some coverage of chapters 13-21, 36, and 55. It currently handles factual, comparison, and relationship reasoning questions. The knowledge representation needs to be both deepened and expanded to cover the remaining chapters. The current representation handles knowledge about structure and function, process regulation, and energy transfer. Additional representation constructs are needed in CLIB for representing experiments, evolution, continuity and change, and interdependence in nature. Expanding the knowledge base to the full textbook requires ensuring the scalability of the knowledge-authoring system in AURA. Additional work also needs to be done on developing new reasoning methods, such as process-interruption reasoning, and on expanding the range of questions that the system suggests. The usability of the system will also improve substantially with an expanded range of English accepted in input questions.
More Extensive Educational Studies
The evaluation considered in this article lasted for only a few hours over a single day. It remains to be shown if similar improvements in student learning can be obtained if the students are allowed to take Inquire Biology home and study from it over a period of several weeks. The current study suggested that Inquire Biology may be especially useful for lower-performing students; additional students need to be tested to validate this trend. We also need to better understand which specific features of Inquire Biology contributed to the improvement in learning that we observed.
Improving Learning Gains
Current results show that the use of Inquire Biology improved posttest scores by approximately 10 percent. A natural question is whether this is the maximum improvement possible. Several new educational supports could be added to Inquire Biology: building a better model of student comprehension, identifying student-specific problem areas in order to provide targeted support, and incorporating better problem-solving dialogue capabilities. In the long term, it could include a full-fledged intelligent tutoring system.
Summary and Conclusions
Inquire Biology is an innovative application of AI technology to an electronic textbook. The key ideas in Inquire Biology are to provide automatically generated concept summary pages and suggested questions, and to answer comparison and relationship questions. This is enabled by taking advanced AI technologies and blending them into an ebook to provide a seamless user experience. Most natural language question-asking interfaces suffer from the problem that the student does not have a clear sense of what kinds of questions the system can handle; Inquire Biology addresses this through suggested questions. The evaluation results show that students find Inquire Biology usable and engaging, and that they learned more. Although this result is from a preliminary evaluation study, it is significant as one of the first to demonstrate the usefulness of explicitly represented knowledge and question answering as students study from their assigned science textbook.
When students transition from print textbooks to electronic textbooks, they will learn more only if the new medium better supports their learning process. To date, most work with electronic textbooks reproduces capabilities, such as highlighting and annotation, that are already available for paper textbooks. By using knowledge representation, question-answering, and natural language generation techniques from AI, we have shown that an electronic textbook can go beyond what is possible in paper books. Specifically, knowledge representation can support students in asking and answering questions about the relationships among the large number of concepts newly introduced in challenging science courses. By helping students understand the relationships among concepts, an AI-enriched textbook has the potential to increase students' conceptual understanding as well as their satisfaction with study materials. We hope that Inquire Biology will inspire a variety of AI methods for personalization and for leveraging online resources in electronic textbooks, helping to develop the inquisitive scientists and engineers that are so much needed.
This work has been funded by Vulcan Inc. The authors wish to thank the members of the Inquire Biology development team: Eva Banik, Roger Corman, Nikhil Dinesh, Debbie Frazier, Stijn Heymans, Sue Hinojoza, Eric Kow, David Margolies, Ethan Stone, William Webb, Michael Wessel, and Neil Yorke-Smith.
Agrawal, R.; Gollapudi, S.; Kannan, A.; and Kenthapadi, K. 2011. Data Mining for Improving Textbooks. SIGKDD Explorations 13(2): 7-10.
Ausubel, D. P. 1965. A Cognitive Structure View of Word and Concept Meaning. In Readings in the Psychology of Cognition, ed. R. C. Anderson and D. P. Ausubel, 103-115. New York: Holt, Rinehart and Winston.
Ausubel, D. P., and Youssef, M. 1963. Role of Discriminability in Meaningful Parallel Learning. Journal of Educational Psychology 54(6): 331-331.
Baader, F.; Calvanese, D.; Mcguinness, D. L.; Nardi, D.; and Patel-Schneider, P. F., eds. 2007. The Description Logic Handbook: Theory, Implementation, and Applications, 2nd ed. Cambridge, UK: Cambridge University Press.
Banik, E.; Kow, E.; Chaudhri, V.; Dinesh, N.; and Oza, U. 2012. Natural Language Generation for a Smart Biology Textbook. Paper presented at the International Conference on Natural Language Generation, Utica, IL, 30 May-1 June.
Barker, K.; Porter, B.; and Clark, P. 2001. A Library of Generic Concepts for Composing Knowledge Bases. In Proceedings of the 1st International Conference on Knowledge Capture, 14-21. New York: Association for Computing Machinery.
Bliss, J.; Askew, M.; and Macrae, S. 1996. Effective Teaching and Learning: Scaffolding Revisited. Oxford Review of Education 22(1): 37-61.
Bransford, J. D.; Brown, A. L.; and Cocking, R. R. 1999. How People Learn: Brain, Mind, Experience, and School. Washington, DC: National Research Council.
Brint, S.; Douglass, J. A.; Thomson, G.; and Chatman, S. 2010. Engaged Learning in the Public University: Trends in the Undergraduate Experience. Report on the Results of the 2008 University of California Undergraduate Experience Survey. Berkeley, CA: Center for Studies in Higher Education.
Brooke, J. 1996. SUS: A Quick and Dirty Usability Scale. In Usability Evaluation in Industry, ed. P. W. Jordan, B. Thomas, B. A. Weerdmeester, and A. L. McClelland. London: Taylor and Francis.
Chaudhri, V. K.; Heymans, S.; Tran, S.; and Wessel, M. 2013. Query Answering in Object-Oriented Knowledge Bases. Paper presented at the 6th Workshop on Answer Set Programming and Other Computing Paradigms, Istanbul, Turkey, 24-29 August.
Chaudhri, V. K., and Tran S. C. 2012. Specifying and Reasoning with Underspecified Knowledge Bases Using Answer Set Programming. In Proceedings of the 13th International Conference on Knowledge Representation and Reasoning. Palo Alto, CA: AAAI Press.
Chaudhri, V.; John, B. E.; Mishra, S.; Pacheco, J.; Porter, B.; and Spaulding, A. 2007. Enabling Experts to Build Knowledge Bases from Science Textbooks. In Proceedings of the 4th International Conference on Knowledge Capture. New York: Association for Computing Machinery.
Chu-Carroll, J., and Fan, J. 2011. Leveraging Wikipedia Characteristics for Search and Candidate Generation. In Proceedings of the 25th AAAI Conference on Artificial Intelligence. Palo Alto, CA: AAAI Press.
Clark, P. E., and Porter, B. 2012. KM--The Knowledge Machine 2.0 User's Guide. Technical Report, Dept. of Computer Science, University of Texas at Austin.
Clark, P.; Chaw, S.-Y.; Barker, K.; Chaudhri, V.; Harrison, P.; Fan, J.; John, B.; Porter, B.; Spaulding, A.; Thompson, J.; and Yeh, P. Z. 2007. Capturing and Answering Questions Posed to a Knowledge-Based System. In Proceedings of the 4th International Conference on Knowledge Capture. New York: Association for Computing Machinery.
diSessa, A. A. 1993. Toward an Epistemology of Physics. Cognition and Instruction 10(2-3): 105-225.
Driver, R., and Easley, J. 1978. Pupils and Paradigms: A Review of Literature Related to Concept Development in Adolescent Science Students. Studies in Science Education 5(10): 61-84.
Ferrucci, D. A. 2012. This Is Watson. IBM Journal of Research and Development 56(3-4).
Fitting, M. 1996. First-Order Logic and Automated Theorem Proving. Berlin: Springer.
Gelfond, M., and Lifschitz, V. 1990. Logic Programs with Classical Negation. In Logic Programming: Proceedings Seventh International Conference, ed. D. Warren and P. Szeredi, 579-597. Cambridge, MA: The MIT Press.
Graesser, A. C.; McNamara, D. S.; Louwerse, M. M.; and Cai, Z. 2004. Coh-Metrix: Analysis of Text on Cohesion and Language. Behavior Research Methods 36(2): 193-202.
Grosof, B. N. 2009. Silk: Higher Level Rules with Defaults and Semantic Scalability. In Web Reasoning and Rule Systems, Proceedings of the Third International Conference, Volume 5837, Lecture Notes in Computer Science, 24-25. Berlin: Springer.
Gunning, D.; Chaudhri, V. K.; Clark, P.; Barker, K.; Chow, S.-Y.; Greaves, M.; Grosof, B.; Leung, A; McDonald, D.; Mishra, S.; Pacheco, J.; Porter, B.; Spaulding, A.; Tecuci, D.; and Tien, J. 2010. Project Halo Update--Progress Toward Digital Aristotle. AI Magazine 31(3).
Gunning, D.; Chaudhri, V.; and Welty, C., eds. 2010. Special Issue on Question Answering Systems. AI Magazine 31(3).
Kintsch, W. 1988. The Use of Knowledge in Discourse Processing: A Construction-Integration Model. Psychological Review 95(2): 163-182.
Mason, K.; Losos, J.; and Singer, S. 2011. Biology, 9th ed. New York: McGraw Hill.
Mayer, R. E. 2003. The Promise of Multimedia Learning: Using the Same Instructional Design Methods Across Different Media. Learning and Instruction 13(2): 125-139.
Mayer, R. E. 1979. Can Advance Organizers Influence Meaningful Learning? Review of Educational Research 49(2): 371-383.
Mayer, R. E., and Massa, L. J. 2003. Three Facets of Visual and Verbal Learners: Cognitive Ability, Cognitive Style, and Learning Preference. Journal of Educational Psychology 95(4): 833-833.
Means, B.; Toyama, Y.; Murphy, R.; Bakia, M.; and Jones, K. 2009. Evaluation of Evidence Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies. Menlo Park CA: SRI International.
Miller, K., and Levine, J. 2010. Biology, 2nd ed. Boston: Pearson.
National Research Council. 1999. How People Learn: Brain, Mind, Experience, and School. Washington, DC: National Academy Press.
Novak, J. D. 1998. Learning, Creating, and Using Knowledge: Concept Maps as Facilitative Tools in Schools and Corporations. Journal of E-Learning and Knowledge Society 6(3): 21-30.
Ozuru, Y.; Dempsey, K.; and McNamara, D. S. 2009. Prior Knowledge, Reading Skill, and Text Cohesion in the Comprehension of Science Texts. Learning and Instruction 19(3): 228-242.
Palinscar, A. S., and Brown, A. L. 1984. Reciprocal Teaching of Comprehension-Fostering and Comprehension-Monitoring Activities. Cognition and Instruction 1(2): 117-175.
Pera, M., and Ng, Y. 2012. Personalized Recommendations on Books for K-12 Students. In Proceedings of the Fifth ACM Workshop on Research Advances in Large Digital Book Repositories and Complementary Media. New York: Association for Computing Machinery.
Pressley, M., and Afflerbach, P. 1995. Verbal Protocols of Reading: The Nature of Constructively Responsive Reading. Mahwah, NJ: Lawrence Erlbaum.
Reece, J.; Urry, L.; Cain, M.; Wasserman, S.; Minorsky, P.; and Jackson, R. 2011. Campbell Biology, 9th ed. Boston: Benjamin Cummings.
Scardamalia, M., and Bereiter, C. 2006. Knowledge Building: Theory, Pedagogy, and Technology. The Cambridge Handbook of the Learning Sciences, 97-115. Cambridge, UK: Cambridge University Press.
Scardamalia, M.; Bereiter, C.; Hewitt, J.; and Webb, J. 1996. Constructive Learning from Texts in Biology. In Relations and Biology Learning: The Acquisition and Use of Knowledge Structures in Biology, ed. M. K. K. M. Fischer, 44-64. Berlin: Springer.
Sweller, J. 1988. Cognitive Load During Problem Solving: Effects on Learning. Cognitive Science 12(2): 257-285.
Udeani, U., and Okafor, P. N. 2012. The Effect of Concept Mapping Instructional Strategy on the Biology Achievement of Senior Secondary School Slow Learners. Journal of Emerging Trends in Educational Research and Policy Studies 3(2): 137-142.
(1.) This data is based on our analysis of a specific biology textbook and measuring the number of new concepts and relationships that needed to be encoded from it.
(2.) Different groups of students and courses can have varying requirements. For example, students studying for an advanced placement exam need to be responsive to the requirements of that exam. We have made no special effort to customize Inquire Biology to such specific needs.
(3.) See www.projecthalo.com.
(4.) See www.questiongeneration.org/.
(5.) See www.kno.com.
(6.) See www.inkling.com.
(7.) See www.wolframalpha.com/about.html.
Vinay K. Chaudhri received his Ph.D. in computer science from the University of Toronto, Canada, and is currently a program director in the Artificial Intelligence Center at SRI International. His research focuses on the science, engineering, and application of large knowledge base systems.
Britte Haugan Cheng is a senior education researcher at SRI International's Center for Technology in Learning. She received her Ph.D. from the University of California, Berkeley's Cognition and Development Program in the Graduate School of Education. Her research focuses on the design and implementation of STEM learning technologies, instruction, and assessments.
Adam Overholtzer is an interaction designer and user interface developer at SRI International, where he designed and built the Inquire iPad app. Overholtzer likes to work in pixels, in code, and in his free time, Lego.
Jeremy Roschelle specializes in the design and development of integrated interventions to enhance learning of complex and conceptually difficult mathematics and science; learning sciences-based research in mathematics education, on collaborative learning, and with interactive technology; and the management of large-scale multiyear, multi-institutional research and evaluation projects.
Aaron Spaulding is a senior computer scientist and interaction designer at SRI's Artificial Intelligence Center. His work centers on developing usable interfaces for AI systems that meet real user needs. He holds a Master's degree in human computer interaction from Carnegie Mellon University.
Peter Clark is a senior research scientist at Vulcan Inc. working in the areas of natural language understanding, question answering, machine reasoning, and the interplay between these three areas. He received his Ph.D. in computer science from Strathclyde University, UK, in 1991.
Mark Greaves is technical director for analytics at the Pacific Northwest National Laboratory (PNNL). Prior to joining PNNL, he was director of knowledge systems for Vulcan Inc., where he oversaw the AURA and Inquire Biology work reported on here. Prior to joining Vulcan, he served as a program manager at the U.S. Defense Advanced Research Projects Agency, where he led numerous projects in AI and semantic web technology.
David Gunning is the program director for enterprise intelligence at the Palo Alto Research Center (PARC). He directs PARC's efforts in artificial intelligence and predictive analytics focused on the enterprise. These include projects in anomaly and fraud detection, contextual intelligence, recommendation systems, and tools for smart organizations. Prior to PARC, Gunning was a senior research program manager at Vulcan Inc., a program manager at DARPA (twice), SVP of SET Corp., vice president of Cycorp, and a senior scientist in the Air Force Research Labs. At DARPA, he managed the Personalized Assistant that Learns (PAL) project that produced Siri, which was acquired by Apple, and the Command Post of the Future (CPoF) project that was adopted by the U.S. Army for use in Iraq and Afghanistan. Gunning holds an M.S. in computer science from Stanford University, an M.S. in cognitive psychology from the University of Dayton, and a B.S. in psychology from Otterbein College.
Table 1. Typical Structure for Each Condition. Each session comprised: intro and training; an active reading task (60 minutes); lunch (1 hour); intro; a homework support task (90 minutes); a posttest (20 minutes); and a debrief.

Figure 3. Asking Questions with Assistance. Question templates offered when a user asks a question:
define: Define cellular respiration.
structure: What is the structure of a chloroplast?
function: What is the function of a plasma membrane in a eukaryotic cell?
compare: What are the differences between chloroplasts and mitochondria?
relate: If the chloroplasts were removed from a plant, what events would be affected?
search: Search book for photosynthesis.

Figure 8. Quantitative Posttest Results (values read from bar graph).
Homework scores: Inquire 81; Ablated Inquire 74; Paper Book 71.
Quiz scores: Inquire 88; Ablated Inquire 75; Paper Book 81.
P-values from two-tailed t-tests, homework scores: Full versus Ablated 0.12; Full versus Textbook 0.02; Ablated versus Textbook 0.52.
P-values from two-tailed t-tests, quiz scores: Full versus Ablated 0.002; Full versus Textbook 0.05; Ablated versus Textbook 0.18.
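The significance values in Figure 8 come from two-tailed t-tests on participants' scores. The raw score data is not reproduced here; as a rough sketch of the underlying computation, the snippet below computes the t statistic for two hypothetical score lists, assuming Welch's unequal-variance form (the article does not specify which variant was used).

```python
from statistics import mean, variance
from math import sqrt

def welch_t(a, b):
    """Welch's t statistic for two independent samples.
    A two-tailed p-value would then be looked up in the
    t distribution with Welch-Satterthwaite degrees of freedom."""
    na, nb = len(a), len(b)
    # variance() is the sample variance (n - 1 denominator)
    return (mean(a) - mean(b)) / sqrt(variance(a) / na + variance(b) / nb)

# Hypothetical score samples for illustration only; not the study's data.
inquire_scores = [80, 82, 84]
paper_scores = [74, 75, 76]
print(round(welch_t(inquire_scores, paper_scores), 2))  # → 5.42
```

In practice one would use a statistics package (for example, SciPy's two-sample t-test) to obtain the p-value directly rather than computing the statistic by hand.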
Authors: Chaudhri, Vinay K.; Cheng, Britte Haugan; Overholtzer, Adam; Roschelle, Jeremy; Spaulding, Aaron; Cl
Date: Sep 22, 2013