
The AP descriptive chemistry question: student errors.

 For over a decade, the authors have been involved in a design
 theory experiment providing software for high school students
 preparing for the descriptive question on the Advanced
 Placement (AP) chemistry examination. Since 1997, the software
 has been available as a Web site offering repeatable practice.
 This study describes a 4-year project during which incorrect
 responses using the most recent interface were collected and
 saved in a dataset. A descriptive analysis suggests that the
 most basic errors include failing to write appropriate
 chemical formulas, to recognize reactive species in net ionic
 reactions, and to recognize weak electrolytes. The largest
 single group of reported errors, failure to recognize weak
 electrolytes, suggests strategies for improved student
 performance on the AP examination. In addition, the results of
 this study support the development of greatly enhanced
 feedback for future learners who use the site.

For virtually every year since its inception, the Advanced Placement (AP) chemistry examination has included an item generally called the "descriptive chemistry question." Such a question presents a statement such as:

Methane is burned in an excess of oxygen

An accepted answer from the student is:

CH₄ + O₂ → CO₂ + H₂O

The answer accepted by the graders of the AP exam is a net ionic representation that is not necessarily balanced. Each student is given eight such statements and expected to select and respond to five. Traditionally, each item is scored using a three-point scale where one point is earned for providing the formula for the correct reactant(s) and two points are earned for the correct product(s).
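The three-point rubric described above can be sketched as a simple function. This is only an illustration of the point structure; the actual AP rubric is applied by human readers, and the function name is ours:

```python
def score_response(reactants_correct: bool, products_correct: bool) -> int:
    """Sketch of the AP three-point rubric for one descriptive item:
    1 point for correct reactant formulas, 2 points for correct products."""
    return (1 if reactants_correct else 0) + (2 if products_correct else 0)
```

Under this scheme a fully correct response earns 3 points, and a response with correct products but wrong reactants still earns 2.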

Since the early 1990s, our research has focused on software development aimed at increasing knowledge of descriptive chemistry and enhancing student performance on the AP chemistry examination. Our current software is available on a Web site for both teachers and students; for students it provides an opportunity for repetitive practice.

This study analyzed 4 years of student errors collected from the Web site using the most current interface and saved in a dataset. Some errors are typographical; these appear rarely, at most a few times among the thousands of incorrect responses recorded. Patterns of student errors, by contrast, may reflect deeply held misconceptions. In addition to our analysis, teachers are provided with access to our database of errors so that they can use it to focus their instruction.

Supporting Literature

Our research with the AP descriptive chemistry Web site is grounded in literature concerning the use of Web-based repetitive practice for teaching and learning. Although use of the Web for teaching/learning is becoming very common, empirical support for specific strategies is lacking. Much is known about the effectiveness of repetitive practice with feedback, however.

Using the Web as a teaching tool appears to be as effective as traditional classroom teaching. In experimental studies, Web-based instruction consistently produces no significant difference in test scores compared to those from traditional instruction (Wegner, Holloway, & Garton, 1999). Web instruction does, however, seem to affect student impressions of a course in positive ways (Brown, 1995; Wegner et al., 1999).

By nature, computers are ideal for providing students with practice scenarios and problems. Computers can provide problems, evaluate responses to those problems, and provide immediate feedback. Providing appropriate feedback is known to increase the rate of learning (Lhyle & Kulhavy, 1987). In fact, effective practice must occur with feedback (Bransford, Brown, & Cocking, 1999). Feedback that explains how answers are determined and provides further instruction is most effective (Pressley & McCormick, 1995; Renkl, 1998). Engaging in practice and receiving feedback makes for better learning by promoting self-regulation (Butler & Winne, 1995). Pintrich describes self-regulated learning as "involving the active, goal-directed, self-control of behavior, motivation and cognition for academic tasks by individual students" (Pintrich, 1995).

While practice typically implies homework or some type of classroom work, testing also can serve as practice. Practice tests and quizzes not only prepare students for the type and variety of questions on a graded exam, they also can serve as a teaching and learning tool for course content (Foos & Fisher, 1988).

The Web facilitates the delivery of practice and supports distance learning. We conjecture that the success of Web practice as a learning tool hinges on providing effective feedback (Brooks, Schraw, & Crippen, 2005). Effective feedback ensures that students not only identify what they do not know but also have the opportunity to reflect and acquire the missing knowledge.

Using the Web to grade homework is a very practical application of the technology. Online homework systems have been shown to have a positive effect on test scores for both physics and chemistry students (Dufresne, Mestre, Hart, & Rath, 2002; Penn, Nedeff, & Gozdzik, 2000). This is especially so when the exam questions require the use of memorized concepts or the use of simple problem-solving algorithms (Paull, Jacob, & Herrick, 1999).

Parallel to Web-based homework systems are Web-based practice systems where students can take practice quizzes of their own volition. While these practice exams do not affect a student's grade, early research suggests that students who use such practice score higher on actual exams than students who rely only on traditional study methods (Hall, Pilant, & Strader, 1999). Charlesworth (2000) reports a direct correlation between online practice quiz scores and traditional exam scores for general chemistry students.


Method

Our research on supporting the AP descriptive chemistry question has focused on software development using an instructional design perspective. We use a design theory methodology that emphasizes an ongoing repetitive cycle of design, testing, data collection, and analysis (Cobb, Confrey, diSessa, Lehrer, & Schauble, 2003). Consistent with design theory, our analysis and development cycles rely heavily on retrospective analysis.

A decade ago, we developed a desktop software tool that allowed teachers to generate sets of eight items closely resembling the AP question. Items were grouped, and selections were made randomly from within those groups so that the distribution of items reflected a typical AP question. Teachers could print examples of eight-item questions and the related answer keys.
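The grouped sampling described above can be sketched as stratified random selection. The group names and per-group counts below are illustrative; the paper does not specify the actual grouping or distribution:

```python
import random

def build_quiz(item_groups, counts):
    """Draw a quiz by sampling, without replacement, counts[g] items from
    each group g, so the mix of items mirrors a typical eight-item AP
    descriptive question. Group names and counts are hypothetical."""
    quiz = []
    for group, n in counts.items():
        quiz.extend(random.sample(item_groups[group], n))
    random.shuffle(quiz)  # present items in random order
    return quiz

# Hypothetical item pools and distribution summing to eight items.
groups = {
    "combustion": ["comb-1", "comb-2", "comb-3"],
    "redox": ["redox-1", "redox-2", "redox-3", "redox-4", "redox-5"],
    "acid_base": ["acid-1", "acid-2", "acid-3"],
}
quiz = build_quiz(groups, {"combustion": 2, "redox": 4, "acid_base": 2})
```

Sampling within groups, rather than from the full pool, is what keeps each generated quiz representative of the exam's topic distribution.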

Since 1997, our original software has been available as a Web site offering repetitive practice. Annually, students and teachers use this system extensively (Crippen, Brooks, & Abuloum, 2000). Prior to 2000, the site offered the same functionality as the desktop version of the software; upon request, it randomly built eight-item quizzes that were similar in format to the AP exam. Using results from early studies, we produced a major revision of the Web site in 2000 (Brooks & Crippen, 2001; Crippen & Brooks, 2001, 2002, 2004). The current interface permits students either to engage in AP exam practice (answering eight items distributed from among the groups) or to focus on learning the content by practicing with individual items (responding to one item at a time until correct).

In addition to building and grading quizzes and providing feedback to students, the most recent version of our practice Web site has been capturing student responses for each item attempt. In principle, the responses were chemical formulas for the products and reactants of several categories of chemical reactions.


Results

Descriptive Statistics: During the period January 9, 2001 through November 12, 2004, 5,304 users created records. Of these, 3,446 (65%) accepted the invitation to permit study of their performance (University of Nebraska-Lincoln IRB #99-11-068FB). The self-identification of these users is displayed in Table 1.

To develop forensics related to potential hacking, we logged every interaction whether or not the user accepted our terms. Users who accepted participation in the study averaged 52 discrete interactions with the Web site, while those who declined averaged 45. We have no information about those declining participation other than that they logged in.

The data for AP high school students and teachers were examined in more detail. Some self-descriptions of the AP students appear in Table 2.

A most striking outcome of these reports is how few of the students are 19 or over: almost 1,700 students (90%) report themselves as members of a 'protected' group for such studies. While more students used the site from home than from school, the school-based usage is reported as part of classroom work. Of the top 1,000 users in the student group, the average number of uses was 132. Of the top 100 users, that number was 525. The top 10 AP student-users made an average of 1,110 uses of the site, a very high level of practice.

Supported by previous results (Brooks & Crippen, 2001), our latest design sought to encourage practice on individual items instead of taking a typical eight-item AP question. Of the 1,895 students, only 402 (21%) ever requested a full set of eight items. One user received 322 eight-item questions; the next largest number was 95, and only 6 users took more than 50. When ordered by the number of items received, the 100th user on the list took 6 full AP questions. Thus, there was dramatic use of the item-by-item practice strategy and seemingly less emphasis on holistic practice for the AP examination.

Empirical studies suggest that students make poor use of computer help systems (Aleven, Stahl, Schworm, Fischer, & Wallace, 2003); we interpret this as an issue of design. The results reported here contradict that trend and show that students do use our help system: 998 AP student users (53%) used tutoring at least once, and 567 AP students (30%) devoted at least 5% of their interactions to accessing the tutors.

In support of using our materials in a traditional classroom setting, 473 self-reported AP teachers used the site with an average of 26 uses each. Of these, 307 teachers (65%) reported using the site at school.

Incorrect Responses: In interacting with the site, the 5,304 users produced over 135,000 incorrect responses in responding to 227 items. When creating AP-like test questions consisting of eight items, a sampling procedure was used such that randomized items were selected from within groups, making each selection typical of eight-item AP questions. For practice, however, the items were sequenced in groups and each user was brought through the group sequence one item at a time. Because users made such extensive use of practice items, items that appeared early in a group's queue received disproportionately more responses than those appearing later. It is inappropriate, therefore, to try to identify the 'most missed' item or the 'most difficult item from the combustion group.'

A submission was identified as an incorrect response if the formula for any reactant or product of a given reaction was wrong. These incorrect submissions were extracted and disaggregated by reaction type, then analyzed descriptively by counting occurrences and normalizing to percentages. Using an emergent set of categories, the authors applied their combined experience as chemistry teachers to diagnose likely sources of the most common errors. Selected results of user inputs are shown in Table 3.
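The counting-and-normalizing step described above amounts to a frequency tally over the incorrect submissions for an item. A minimal sketch, with hypothetical data and a function name of our choosing:

```python
from collections import Counter

def error_percentages(incorrect_submissions):
    """Tally each distinct wrong formula for one item and normalize to a
    percentage of all incorrect submissions, as in the descriptive
    analysis in the text. Input data here is illustrative."""
    counts = Counter(incorrect_submissions)
    total = sum(counts.values())
    return {formula: round(100 * n / total)
            for formula, n in counts.most_common()}

# Hypothetical error log for "solid lithium oxide is added to distilled water".
log = ["LiO", "LiO", "Li2O2", "LiOH", "LiOH", "LiOH", "LiOH",
       "LiOH", "LiOH", "LiOH"]
pcts = error_percentages(log)
```

`Counter.most_common()` orders the result so the dominant error pattern for an item surfaces first.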


Discussion

It would come as no surprise to experienced teachers that some error patterns occur frequently.

Students write incorrect formulas. This occurs frequently because they have not learned the charges of ionic species. For the item, "50.0 mL of 0.100 M sodium carbonate is added to 50.0 mL of 0.100 M hydrochloric acid," student responses of NaCO₃ and CO₃ together accounted for 11% of the errors. For the reaction, "solid lithium oxide is added to distilled water," failure to learn charges, as reflected by writing LiO as a reactant, was very common (nearly 20% of errors).

Students fail to invoke dissociation when they should and invoke it when they should not. For the item, "50.0 mL of 0.100 M sodium carbonate is added to 50.0 mL of 0.100 M hydrochloric acid," responses such as Na₂CO₃ and NaCl accounted for 12% of the errors. For the reaction, "the gases ammonia and hydrogen chloride are mixed," the most frequent errors involved writing product ions such as NH₄⁺ instead of NH₄Cl (27%).

Students do not recognize weak electrolytes, especially weak acids. This is a common error. For the item, "aqueous ammonia is added to hydrofluoric acid," the most frequent error (21%) involved writing H⁺ as a reactant, implying failure to recognize that HF is a weak electrolyte. For the item, "aqueous potassium dichromate is added to acidified aqueous sodium sulfite," the high frequency of SO₃²⁻ as an erroneous response (12%) suggests failure to identify a weak acid (HSO₃⁻).

Including spectator ions (those not involved in the net ionic reaction) also is common. For the item, "chlorine gas is bubbled into cold, dilute sodium hydroxide," the species Na⁺ was used as a reactant in 9% of reported errors.

Misconceptions are obvious. Common errors include: writing CuO instead of Cu as a product for the item, "solid copper(II) sulfide is heated strongly in oxygen gas" (28%); selecting Ni(NH₃)₄²⁺ instead of Ni(NH₃)₆²⁺ for the item, "excess 15 M ammonia is added to aqueous nickel(II) sulfate" (24%); and writing SiF instead of SiF₄ as a product in "hydrogen fluoride gas is passed over moist silicon dioxide" (15%).

Certain typographical errors were predictable and commonplace. H20 (h-two-zero) was typed instead of H2O (h-two-oh) as a product in 28% of the errors for "aqueous hydrogen peroxide is decomposed using a catalyst." Errors of this type would not be a problem in the essay portion of the AP test, which is evaluated by human readers.


Perhaps it is not surprising that the most frequent errors produced at the Web site supporting the AP descriptive chemistry question revolve around the most fundamental issues involved in writing net ionic equations. Although a large portion of the errors would not surprise general chemistry teachers, these data have once again provided the opportunity to redesign the site so that very specific feedback can be developed on an item-by-item basis. We have been encouraged by the fact that users do seek tutoring, which suggests that a redesign of the tutoring section might enhance learning. Our next design revision will include an error analysis algorithm that detects known errors and provides chemistry-specific feedback. For example, when a user writes H⁺ and F⁻ instead of HF for hydrofluoric acid, that student will receive specific feedback noting that HF is a weak acid.
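The planned error-analysis step can be sketched as a lookup from known (item, wrong response) pairs to targeted feedback messages. The table entries, messages, and names below are illustrative, not the site's actual data or code:

```python
# Hypothetical table of known errors mapped to chemistry-specific feedback.
KNOWN_ERRORS = {
    ("aqueous ammonia is added to hydrofluoric acid", "H+"):
        "HF is a weak acid: write the undissociated formula HF, not H+.",
    ("chlorine gas is bubbled into cold, dilute sodium hydroxide", "Na+"):
        "Na+ is a spectator ion and does not appear in the net ionic equation.",
}

GENERIC = "Incorrect. Check your formulas, dissociation, and weak electrolytes."

def feedback_for(item: str, submission: str) -> str:
    """Return targeted feedback for a recognized error, else a generic
    message. Sketch of the planned algorithm, not the deployed system."""
    return KNOWN_ERRORS.get((item.lower(), submission), GENERIC)
```

A dictionary lookup like this handles only exact matches; the deployed algorithm would also need to normalize student input (whitespace, notation) before matching.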

Additional Access

Interested parties (teachers) may contact the authors to receive a password permitting access to an item-by-item error analysis directly from the database. To receive a login ID, please contact David W. Brooks.
Table 1

Status Self-Identification of 3,446 Users

Not Reported 533
AP HS Student 1,895
Non-AP HS Student 84
College Student 140
Other Student 44
AP HS Teacher 473
Non-AP HS Teacher 107
College Teacher 57
Other Teacher 46
Chemist (not student or teacher) 19
Other 48

Total 3,446

Table 2

Self-Description of 1,895 Self-Identified AP Students

Gender

 Not reported 73
 Male 969
 Female 853

Age

 Not reported 167
 ≥ 19 30
 < 19 1,698

Location of access

 Not reported 235
 Home 915
 School 745

If accessing at school, time of access

 Not reported 38
 During class 555

Table 3

Selected Errors

Emergent Problem             Item                                   Incorrect Response   %

Incorrect formula            Solid lithium oxide is added           LiO                  20
                             to distilled water

Failure to dissociate        50.0 mL of 0.100 M sodium carbonate    Na₂CO₃; NaCl         12
                             is added to 50.0 mL of 0.100 M
                             hydrochloric acid

Inappropriate dissociation   The gases ammonia and hydrogen         NH₄⁺                 27
                             chloride are mixed

Weak acid/base               Aqueous ammonia is added to            H⁺                   21
                             hydrofluoric acid

Including spectators         Chlorine gas is bubbled into cold,     Na⁺                   9
                             dilute sodium hydroxide

Misconceptions               Solid copper(II) sulfide is heated     CuO                  28
                             strongly in oxygen gas

Typographical                Aqueous hydrogen peroxide is           H20 (zero)           28
                             decomposed using a catalyst


References

Aleven, V., Stahl, E., Schworm, S., Fischer, F., & Wallace, R. (2003). Help seeking and help design in interactive learning environments. Review of Educational Research, 73, 277-320.

Bransford, J. D., Brown, A. L., & Cocking, R. R. (Eds.). (1999). How people learn: Brain, mind, experience, and school. Washington, D.C.: National Academy Press.

Brooks, D. W., & Crippen, K. J. (2001). Learning difficult content using the Web: Strategies make a difference. Journal of Science Education and Technology, 10(4), 283-285.

Brooks, D. W., Schraw, G. P., & Crippen, K. J. (2005). Performance-related feedback: The hallmark of good instruction. Journal of Chemical Education, 82(4), 641-644.

Brown, A. (1995). Evaluation of teaching and learning processes in a computer-supported mechanical engineering course. Computers & Education, 25(1), 59-65.

Butler, D. L., & Winne, P. H. (1995). Feedback and self-regulated learning: A theoretical synthesis. Review of Educational Research, 65(3), 245-281.

Charlesworth, P. (2000, August). WebCT@MTU: Rejuvenating general chemistry. Paper presented at the Biennial Conference on Chemical Education, Ann Arbor, MI.

Cobb, P., Confrey, J., diSessa, A., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9-13.

Crippen, K. J., & Brooks, D. W. (2001). Teaching advanced placement descriptive chemistry: Suggestions from a testing Web site. The Chemical Educator, 6(5), 266-271.

Crippen, K. J., & Brooks, D. W. (2002). An analysis of student learning at a testing Web site emphasizing descriptive chemistry. Journal of Computers in Mathematics and Science Teaching, 21(2), 183-201.

Crippen, K. J., & Brooks, D. W. (2004). A descriptive analysis of a chemistry teacher's Web site. ERIC Digest, ED481938.

Crippen, K. J., Brooks, D. W., & Abuloum, A. (2000). A Web-site supporting the AP descriptive chemistry question. Journal of Chemical Education, 77, 1087-1088.

Dufresne, R., Mestre, J., Hart, D. M., & Rath, K. A. (2002). The effect of Web-based homework on test performance in large enrollment introductory physics courses. Journal of Computers in Mathematics and Science Teaching, 21(3), 229-251.

Foos, P. W., & Fisher, R. P. (1988). Using tests as learning opportunities. Journal of Educational Psychology, 80, 179-183.

Hall, R. J., Pilant, M. S., & Strader, R. A. (1999, March). The impact of Web-based instruction on performance in an applied statistics course. Paper presented at the International conference on Mathematics/Science Education and Technology, San Antonio, TX.

Lhyle, K. C., & Kulhavy, R. W. (1987). Feedback processing and error correction. Journal of Educational Psychology, 79, 320-322.

Paull, T. A., Jacob, M. J., & Herrick, R. J. (1999, November). Automated homework in electrical engineering technology. Paper presented at the American Society for Engineering Education symposium, West Lafayette, IN.

Penn, J. H., Nedeff, V. M., & Gozdzik, G. (2000). Organic chemistry and the Internet: A Web-based approach to homework and testing using the WE_LEARN system. Journal of Chemical Education, 77(2), 227-231.

Pintrich, P. R. (1995). Understanding self-regulated learning. In P. R. Pintrich (Ed.), Understanding self-regulated learning (pp. 3-12). San Francisco: Jossey-Bass.

Pressley, M., & McCormick, C. B. (1995). Advanced educational psychology for educators, researchers, and policymakers. New York: Harper Collins.

Renkl, A. (1998). Cognitive science: Learning from worked-out examples: A study on individual differences. Retrieved July 10, 2000, from

Wegner, S. B., Holloway, K. C., & Garton, E. M. (1999). The effects of Internet-based instruction on student learning. Journal of Asynchronous Learning Networks, 3(2).


University of Nevada Las Vegas



University of Nebraska-Lincoln

COPYRIGHT 2005 Association for the Advancement of Computing in Education (AACE)

Article Details
Author: Brooks, David W.
Publication: Journal of Computers in Mathematics and Science Teaching
Date: Dec 22, 2005