
Nursing students' self-assessment of their simulation experiences.

ABSTRACT This article describes a self-evaluation and feedback strategy used by nursing students and simulation faculty in a junior-level adult acute care course. Simulations are developed and implemented with the intention of furthering students' clinical judgment skills. A clinical judgment rubric, based on the Tanner Model of Clinical Judgment, is used as a self-assessment tool. The rubric describes the development of clinical judgment over four levels and is scored by students as they reflect on their practice. In addition to using the rubric's descriptors to rate themselves (Beginning, Developing, Accomplished, and Exemplary), students apply an evidence-based process, citing simulation examples of their clinical thinking as support for their ratings. Simulation faculty respond to the postings, affirming students' observations or helping them see a different perspective, and offer guidance to help them move toward the next stage of clinical judgment development. The postings also give clinical faculty insight into the clinical judgment processes they observe in practicum settings.

Key Words Self-Assessment--Simulation--Clinical Judgment Rubric--Clinical Learning


ONE GOAL OF SIMULATION IS THE DEVELOPMENT OF CLINICAL JUDGMENT, OR THE ABILITY TO THINK LIKE A NURSE. Tanner (2006) analyzed and described this complex process in a model of clinical judgment, defining clinical judgment as "an interpretation or conclusion about a patient's needs, concerns or health problems, and/or the decision to take action (or not), to use or modify standard approaches, or to improvise new ones as deemed appropriate by the patient's response" (p. 204).

Simulation offers excellent opportunities for students and faculty to examine students' clinical judgment and to develop their thinking. While some students interact in a simulation scenario, others watch the action; a full group debriefing, involving faculty and peers, then follows.

While this strategy is designed to foster students' development of clinical judgment, it has been the authors' goal to make the connection between simulation and clinical judgment more transparent. This article describes how simulation faculty introduced the Lasater Clinical Judgment Rubric, a theoretically and empirically grounded assessment tool for simulation (Lasater, 2007a), and helped students use it as a personal, reflective self-assessment tool.

Students and faculty now share a common language as well as a means of assessing students' progress in clinical judgment development and goal setting. The self-assessments give faculty who supervise students in the clinical setting additional evidence of clinical thinking with which to support student learning, development, and evaluation.

The Dilemma Nursing students have traditionally spent clinical time in various health care facilities, providing care to patients under the supervision of a clinical instructor and/or preceptor. Clinical faculty may find it difficult to evaluate students' clinical judgment skills when they are supervising multiple students in the same setting. With the availability of increasingly realistic patient simulators and faced with a shortage of clinical sites as well as faculty, nursing programs are utilizing new avenues of clinical learning, such as manikin-based simulation.

Since 2004, junior nursing students in a clinical adult acute care course have participated in manikin-based simulation as an integrated clinical learning activity. The amount of simulation per course has varied, from two to eight four-hour sessions each term, depending upon experiences available in clinical sites, course objectives, and learning needs. Students in the course attend their four-hour simulation sessions in groups of 12. Each session consists of four scenarios, or patient cases, during which a team of three students provides care for the patient while the remaining nine students observe via live video broadcast from a debriefing room. Two faculty, serving as facilitators, remain consistent throughout the course. They develop the scenarios, run the equipment, direct the case, and facilitate the debriefing. The scenarios are written to complement the content the students are learning in lectures and skills labs; for instance, after a lecture on the endocrine system and an intravenous fluid and pump lab, a scenario event may be based upon a diabetic patient on an insulin drip who requires a focused assessment and a particular action.

Debriefing follows each scenario and focuses on the team's care of the patient, including safe practice, priority setting, continuous assessment, communication, resource management, and leadership. Faculty facilitate debriefings with specific course competencies in mind and lead the group, composed of scenario participants and observers, toward discussion of the clinical judgments in each case.

Despite these extensive case discussions, students have requested more personal evaluation and feedback regarding their actions in simulation (Lasater, 2007b). The dedicated simulation faculty also needed a better way to document evidence of clinical thinking and provide feedback to faculty who supervised students in other clinical settings. Thus, a tool was required to help students reflect on their experiences in simulation, assess their progress in developing clinical judgment, and receive feedback and guidance toward the attainment of higher levels of thinking.

Literature Review Reflective journaling is often described in the nursing literature (Craft, 2005; Lasater & Nielsen, 2009; Murphy, 2004; Nielsen, Stragnell, & Jester, 2007), but written self-assessment strategies rarely appear. However, self-assessment or self-report strategies in education are routinely evaluated and reported. One review in particular strongly suggested that self-directedness is enhanced when students have opportunities and structures by which to self-assess, that is, to set goals and monitor their progress toward them (Nicol & Macfarlane-Dick, 2006). The authors cited Boud (1986), who concluded that students should be engaged in identifying the standards that will pertain to their work and in using those standards to assess it. These findings are congruent with the use of a rubric for formative self-assessment.

Likewise, Fitzpatrick (2006) found that self-assessment had a positive impact on students' personal and professional development, especially their sense of autonomy and thinking skills. These conclusions were further supported by a study that examined enhanced self-directed learning as an outcome of self-assessment (Maclellan & Soden, 2006). In addition, self-assessment, which is closely related to reflection (Nicol & Macfarlane-Dick, 2006; White, Crooks, & Melton, 2002), can help students develop plans for personal growth, an important professional skill. When shared, self-assessment offers faculty the opportunity to observe students' thinking (Davies, 2002). Marienau (1999) concluded that self-assessment furthers learning from experience, more effective functioning, and commitment to competency, self-agency, and authority. Manikin-based simulation certainly offers the potential for such learning.

The Reflective Self-Assessment Activity The Clinical Judgment Rubric (Lasater, 2007a) was selected as a means to provide students with more personalized feedback. The rubric's language and developmental progression has greatly enhanced the sharing of thoughts and ideas among groups of students as well as simulation and clinical faculty. Students use it for personal self-assessment and as a feedback mechanism for themselves and for supervising clinical faculty.

In instituting use of the rubric, a clinical judgment scoring sheet was posted on an online learning management system, along with a description of the behaviors at each level (Exemplary, Accomplished, Developing, Beginning) for the four phases of Tanner's (2006) Clinical Judgment Model (Noticing, Interpreting, Responding, and Reflecting). (See Table 1.) The rubric provided a framework for students to organize their thoughts about managing various patient situations. Students completed a scoring sheet based on self-reflection on their practice in simulation, using specific examples from the scenarios in which they participated as well as their contributions in debriefing.

When the rubric was first used, students completed their self-assessments twice, at midterm and at the end of the term. Simulation faculty read them and gave online feedback on the students' thoughts. The completed documents were then available to the students and clinical faculty. Although student performance in simulation was not graded, clinical faculty found the information useful as further confirmation of students' progress in the clinical areas. In addition, students' process of reflecting on their clinical judgments fostered their development and expertise (Lasater, 2007b).

Due to the number of students in the course (48 per term), certain measures were necessary to provide individualized feedback. These included consistency of simulation faculty throughout the term, to facilitate continuous observation of students in the simulation environment; familiarity of simulation faculty with the course competencies, the scenarios, and their objectives; and contact with clinical faculty regarding specific student issues. Table 2 offers a sample of students' self-assessments about their performance in simulation using the Lasater Clinical Judgment Rubric.

When the rubric was first used, students and faculty expressed satisfaction with the feedback process. However, simulation faculty found twice-per-term review impractical for providing thorough, individualized comments on each student's self-assessment. To make the process more manageable while continuing to provide high-quality personalized feedback, students now complete self-assessments once per term in selected courses.

Significance for Nursing Education The ability of students to engage in self-reflection and share their clinical thinking about their practice is evident in their submissions. The majority of students show an ability to think deeply about the situations they encounter in simulation, analyze the patient events and their responses, and apply their experiences to their broader knowledge of nursing and the clinical judgment required to practice safely and effectively.

The tool has proved to be a useful resource for clinical and simulation faculty to review as they plan learning activities for students; requiring specific examples offers additional insight into students' thinking. Often, students' self-assessments parallel the observations of the supervising clinical faculty. Reading the score sheets offers evidence of students' thinking that enables faculty to select appropriate patients, provide supervision at the clinical sites, and observe students more closely when necessary. The Lasater Clinical Judgment Rubric (Lasater, 2007a) offers students the language needed to describe their progress. The four phases of Tanner's Clinical Judgment Model provide a framework for students to organize their thoughts about managing various patient situations. Students describe very specific examples from the scenarios, indicating that they have encountered valid learning experiences in simulation. Their reflections are often deeper and more significant than what they discussed in debriefing, demonstrating that their reflective thinking about the scenarios continues days and weeks beyond the actual simulation experience.


Boud, D. (1986). Implementing student self-assessment. Sydney, Australia: Higher Education Research and Development Society of Australia.

Craft, M. (2005). Reflective writing and nursing education. Journal of Nursing Education, 44(2), 53-57.

Davies, P. (2002). Using student reflective self-assessment for awarding degree classifications. Innovations in Education and Teaching International, 39(4), 307-319.

Fitzpatrick, J. (2006). An evaluative case study of the dilemmas experienced in designing a self-assessment strategy for community nursing students. Assessment & Evaluation in Higher Education, 31(1), 37-53.

Lasater, K. (2007a). Clinical judgment development: Using simulation to create an assessment rubric. Journal of Nursing Education, 46(11), 496-503.

Lasater, K. (2007b). High fidelity simulation and the development of clinical judgment: Student experiences. Journal of Nursing Education, 46(6), 269-276.

Lasater, K., & Nielsen, A. (2009). Reflective journaling for clinical judgment development. Journal of Nursing Education, 48(1), 40-44.

Maclellan, E., & Soden, R. (2006). Facilitating self-regulation in higher education through self-report. Learning Environments Research, 9, 95-110.

Marienau, C. (1999). Self-assessment at work: Outcomes of adult learners' reflections on practice. Adult Education Quarterly, 49(3), 135-146.

Murphy, J. I. (2004). Using focused reflection and articulation to promote clinical reasoning. Nursing Education Perspectives, 25, 226-231.

Nicol, D.J., & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218.

Nielsen, A., Stragnell, S., & Jester, P. (2007). Guide for reflection using the clinical judgment model. Journal of Nursing Education, 46(11), 513-516.

Seropian, M. A., Brown, K., Gavilanes, J. S., & Driggers, B. (2004). High fidelity simulation: Not just a manikin. Journal of Nursing Education, 43(4), 164-169.

Tanner, C.A. (2005). What have we learned about critical thinking in nursing? Journal of Nursing Education, 44(2), 47-48.

Tanner, C.A. (2006). Thinking like a nurse: A research-based model of clinical judgment. Journal of Nursing Education, 45(6), 204-211.

White, D. R., Crooks, S. M., & Melton, J. K. (2002). Design dynamics of a leadership assessment academy: Principal self-assessment using research and technology. Journal of Personnel Evaluation in Education, 16(1), 45-61.

Mary L. Cato, MSN, RN, an instructor and lead simulation specialist, Simulation and Clinical Learning Center, Oregon Health & Science University in Portland, is one of the expert authors of the NLN/Laerdal Simulation Innovation Resource Center (SIRC). Kathie Lasater, EdD, RN, ANEF, is an assistant professor and served in 2007-2008 as interim statewide director of simulation learning, Oregon Health & Science University. Alycia Isabella Peeples, BSN, RN, is an intensive care unit nurse and simulation specialist, Oregon Health & Science University Hospital. For more information, write to Ms. Cato at
Table 1. Lasater Clinical Judgment Rubric Scoring Sheet
(E = Exemplary, A = Accomplished, D = Developing, B = Beginning)

Effective Noticing
* Focused Observation: E A D B
* Recognizing Deviations from Expected Patterns: E A D B
* Information Seeking: E A D B

Effective Interpreting
* Prioritizing Data: E A D B
* Making Sense of Data: E A D B

Effective Responding
* Calm, Confident Manner: E A D B
* Clear Communication: E A D B
* Well-Planned Intervention/Flexibility: E A D B
* Being Skillful: E A D B

Effective Reflecting
* Evaluation/Self-Analysis: E A D B
* Commitment to Improvement: E A D B

(c) Copyright 2007 by Kathie Lasater

Table 2. Examples of Student Self-Assessment of Clinical Judgment
Development Through Simulation

Student     Clinical Judgment Dimension     Self-Assessment

Student A Effective Noticing Beginning
 * Focused Observation
 * Recognizing Deviations
 from Expected Patterns
 * Information Seeking

Student B Effective Interpreting Developing
 * Making Sense of Data
 * Prioritizing Data

Student C Effective Responding Developing
 * Calm, Confident Manner
 * Clear Communication
 * Well-Planned Intervention/Flexibility
 * Being Skillful

Student D Effective Reflecting Accomplished
 * Evaluation/ Self-Analysis
 * Commitment to Improvement

 Evidence for Student Assessment

Student A "With the patient, I sometimes get 'lost' in the
 assessment. More specifically, I know what I see but
 don't know what to do with the information. When I
 was doing the assessment for Juanita, my mind simply
 went blank about what pertinent questions to ask of
 her relative to her COPD exacerbation. I guess I
 was 'expecting' her to allow me to proceed with an
 uninterrupted head-to-toe assessment (which clearly
 was not a reasonable expectation). She kept asking
 me questions and was unable to breathe easily. This,
 of course, threw me off so that my assessment was not
 finished appropriately, and I got sidetracked with
 ensuring that she was getting enough O2. Essentially,
 I think I just got overwhelmed by the situation and
 my lack of understanding of the big picture. I also
 neglected to carry out information-seeking opportunities
 with my teammates in order to troubleshoot more ..."

Student B "I learned so much from this scenario about priorities
 because I had them completely wrong. The priority here
 was getting the consent form signed so that he could go
 to surgery--his pain was important, but I delayed him
 getting treatment for his condition by giving him
 morphine and making it impossible for him to sign the
 consent form. I was so focused on the skill of drawing
 up the medication that I completely lost sight of the
 big picture. I realized later that I am not doing the
 patient or my team members a favor by tuning out of the
 situation. I really tried to participate and actively
 interpret data in the next two simulations based on what
 I learned. In this aspect only, I felt really proud of
 myself and excited that I was able to assess and make
 sense out of what was going on with my patients."

Student C "Being in the leader's role in this scenario, I realized
 that in a stressful situation I get very tunnel visioned
 and start to feel that it is solely my responsibility to
 manage what is going on. I remember thinking that I
 needed help. Once I got the phone orders, my thinking
 became very narrow. I told my team that we needed to
 give her meds, but I became completely stressed and
 disorganized. Later, I realized that when someone is
 having an anaphylactic reaction, the calm, confident,
 and effective leader would have opened up the door and
 yelled for help instead of thinking that it was solely
 up to me to administer 4 medications. I know now that
 part of being calm and confident is knowing that you
 are not really all alone--you have lots of help and you
 can make huge mistakes when you don't use the resources
 that are available to you."

Student D "I spend a great deal of time outside of the simulation
 reflecting on my performance, what I did poorly, what I
 felt confident about, and how to improve my performance
 and knowledge each time. At times, I am overly critical;
 I have high expectations of myself and get frustrated.
 But I also know that becoming a good nurse is an ongoing
 learning process. There will always be things to learn
 and situations that challenge me. I recognize that I
 also need to push and challenge myself in the learning
 process so that I can constantly grow in my skill
 proficiency and confidence. I find the debriefings very
 helpful for my learning in that I can verbalize my
 perceived shortcomings and learn from them as a result
 of the group's responses and observations."
COPYRIGHT 2009 National League for Nursing, Inc.

Article Details
Author: Cato, Mary L.; Lasater, Kathie; Peeples, Alycia Isabella
Publication: Nursing Education Perspectives
Article Type: Report
Date: Mar 1, 2009
