
Why Assessment and Faculty Development Need Each Other: Notes on Using Evidence to Improve Student Learning.

Colleges and universities collect a great deal of data about student learning and experiences, but many do very little with it. At best, the data is shared with administrators and faculty. At worst, it sits on a dusty bookshelf or hides in electronic folders in an assessment office. Chances are, it is not being used as evidence to inform the changes that lead to improved student learning. In a recent survey of institutional leaders, the top reported needs and supports for student learning outcomes assessment were (1) more faculty using the results of student learning assessment and (2) more professional development for faculty and staff (Jankowski, Timmer, Kinzie, & Kuh, 2018).

Assessment Efforts Aimed at Improvement Need Faculty and Faculty Development

Faculty members are the crucial element in improving student learning and experiences, and faculty programs for teaching and learning are ideally situated to connect faculty members already concerned about improving student learning with assessment efforts. Educational developers (a.k.a. faculty developers, or faculty who run programs focused on improving teaching and learning) understand how to structure discussions and workshops that engage faculty members with evidence and help them improve their teaching and, ultimately, student learning.

Faculty ultimately have influence over the educational experiences of students, both on a micro level (assignment and course design, approaches to teaching) and on a more global level (the design of majors and the overall curriculum, how students are advised and supported, how an institution's educational resources are allocated). Simply put, for the data that many assessment offices or institutional researchers gather to have an effect on a school's education, faculty members need to be involved. Working together, educational developers and assessment professionals can support and guide faculty in their efforts, helping to shape assessment questions that are meaningful and to use the findings to inform improvement efforts.

Ideally, as a school considers assessment and data-gathering efforts, faculty members will be involved from the start. Because faculty have the most intimate knowledge of curriculum, courses, assignments, teaching, and learning, they should help design assessments that best target the needs of an institution. Assessment findings should inform decisions about where to target curricular improvements, course re-designs, academic support resources, and more effective teaching approaches. Additionally, faculty input into such initiatives is likely to increase their ownership of assessment efforts, both in the gathering of assessment evidence and, hopefully, in the eventual learning improvement informed by that evidence. Faculty involvement allows those who most directly affect student learning to feel more confident participating in the assessment process, enabling them to better make sense of the data, use it as evidence, and think more clearly about how to align resources and improvement efforts.

Faculty Development Benefits from a Strong Partnership with Assessment Professionals

Perhaps now more than ever, educational development emphasizes evidence-informed practices based on research (e.g., see Ambrose, Bridges, DiPietro, Lovett, & Norman, 2010; Beach, Sorcinelli, Austin, & Rivard, 2016).(1) But faculty members may wonder: "What is the significance of these ideas for our students? In our institutional context? What are my own students' experiences?" Local evidence--and the stories it tells about our students and their experiences--can be extremely compelling for faculty members. Institutional assessment data has the potential to provide that powerful evidence. Some of the teaching and learning questions that local assessment evidence can help answer include:

* Are our students learning XXXX?

* How do our students experience our classrooms? Our curriculum? Our campus?

* Do all of our students feel well supported?

* What types of students struggle, and where?

* What factors influence student success, retention, and completion?

* Who chooses what majors and programs and why?

* What are the unintentional roadblocks in a major or our overall curriculum?

* How many and which students engage in the various educational high-impact experiences that an institution has to offer? Are these experiences high quality and equitably distributed?

* How does our institution compare with peer institutions in terms of student engagement and experiences that lead to student learning?

Some of the most compelling evidence on both of our campuses has come from extensive focus groups with students. Ironically, even at successful faculty centers for teaching and learning, faculty members too rarely sit down with students and talk openly about learning and what happens in our classrooms, labs, and studios.

The opportunity for faculty development programs to incorporate data gathered by the assessment or institutional research office is tremendous--and that information can inform individual faculty practice related to course design and teaching, as well as the overall design of the curriculum, the targeting of academic support, a school's overall strategic priorities, and the allocation of resources in general. A coordinated effort between the assessment and faculty development offices will allow institutions to triangulate assessment findings, student learning, and faculty practices.

Overcoming Potential Barriers to a Partnership between Assessment Efforts and Faculty Teaching & Learning

While the proposition to connect assessment and faculty development seems relatively simple, some obstacles to faculty development/assessment partnerships may need to be overcome. Assessment leaders and faculty developers often think of assessment in different ways, perhaps even as having different purposes. Institutional researchers may focus on how assessment is documented and used for accountability, or they may not have enough teaching experience to fully empathize with just how messy teaching, learning, and assessing can be. Furthermore, many assessment professionals have no training in curriculum, pedagogy, or group facilitation.

Faculty developers, on the other hand, may not understand the exigencies or intricacies of assessment and data, particularly in regard to documenting for accountability. In addition, they are less likely to have training in research design, measurement, and statistics. Faculty and faculty developers may see assessment as a necessary and ongoing part of improving teaching and learning but may not document it in a designated format for an outside audience. Beyond overcoming the challenge of separate spaces and administrative units, which can hinder frequent conversations, we strongly believe that the key players need to reach out and learn more about what each group has to offer the other.

If colleges endeavor to form a tighter relationship between assessment and faculty development in the spirit of improvement, we offer these three broad suggestions:

1. Talk with Each Other

Take your assessment professional or institutional researcher or the person in charge of faculty development out to lunch. Begin to open a dialogue. A few questions to get you started include:

For faculty developers to ask assessment and institutional research professionals:

* What data about student learning and experiences do you have that you find most interesting?

* Based on our college's data, what areas do you see for potential improvement? What are we doing well? What areas do you see for further exploration with faculty?

* How can I help share our data with the people who should be seeing it, and how can we help them make sense of that data?

For assessment and institutional researchers to ask faculty developers:

* What are your current initiatives? What is your programming focusing on? What events or discussions do you have upcoming this term?

* What areas are of most concern to you and the faculty members with whom you work?

* Based on your perspective, what are we doing well and what areas do you see for potential improvement in student learning and experiences?

* How can I help you advance your agenda and initiatives?

* Schedule regular get-togethers for informal discussions between assessment and faculty development leaders. For example, regularly share a coffee or take a walk together.

* At the conclusion of each semester, make a date to examine assessment data jointly. Make note of links between institutional-level data, student use of academic support resources, and academic program findings, as well as how these trends correlate with the topics faculty say they find most challenging about student learning.

2. Co-Sponsor Events that Engage Faculty with Evidence You Already Have

* Never simply share reports. Data requires a process of "collective meaning making." It is the starting point for "a process of inquiry" (Reder, 2014). Data is just a collection of information--it cannot serve as evidence until faculty and administrators create a narrative about its significance. Only then can an institution take evidence-informed actions that can begin to improve student learning.

* Take every opportunity to share possible interpretations with faculty and co-facilitate discussions to help faculty to make meaning of the assessment findings. For example, share the National Survey of Student Engagement snapshot reports at departmental and faculty senate committee meetings or hold brown bag forums in each academic building. Ask faculty: What do you think about this data? How does it connect with your experiences and findings? What questions do you have about our students' experiences? What would you like to know more about? What actions might we take in response to this data?

* Co-sponsor faculty development workshops that incorporate an opportunity to examine and discuss selected data. Work data into sessions about assignment design, converting courses to online formats, or developing flipped classrooms. Invite the tutoring center staff to these workshops so they can target their academic support resources to the needs revealed by the data and faculty.

* Formally showcase examples of successful teaching and learning projects in a Scholarship of Teaching and Learning format. For example, cosponsor a newsletter, website, poster session, or panel discussion that includes the assessment findings as part of the narrative.

3. Form Intentional Partnerships and Begin to Collaborate More Widely

* Collaborate with faculty leaders to select and co-sponsor evidence-informed learning improvement projects that are easily embedded into existing or upcoming teaching and learning initiatives. For example, cosponsor a faculty learning community, retreat, or community of inquiry around such projects.

* Before a faculty development event takes place, contact your assessment professionals to see if they have any data that will lend insight into the topic being discussed.

* During assessment training sessions build in time to discuss teaching and learning. For example, informally showcase and discuss successful teaching approaches on your campus and foster a discussion around how to address teaching challenges.

* Co-sponsor learning improvement grants to provide resources for designing new curricula, courses, and teaching approaches where they are most needed.

Partnership between assessment and faculty development is not simply an enhancement to each of these efforts--it is essential for developing actions that will truly succeed in improving student learning where it is most needed. Improved student learning depends upon it.

References

Ambrose, S., Bridges, M., DiPietro, M., Lovett, M., & Norman, M. (2010). How learning works: Research-based principles for smart teaching. San Francisco, CA: Jossey-Bass.

Beach, A.L., Sorcinelli, M.D., Austin, A.E., & Rivard, J. (2016). Faculty development in the age of evidence: Current practices, future imperatives. Sterling, VA: Stylus.

Jankowski, N. A., Timmer, J. D., Kinzie, J., & Kuh, G. D. (2018, January). Assessment that matters: Trending toward practices that document authentic student learning. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment (NILOA).

Reder, M. (2014). Supporting teaching and learning at small colleges--Past, present, and future: A message from the guest editor. Journal on Centers for Teaching and Learning, 6, 1-11.

AUTHORS

Michael Reder, Ph.D.

Connecticut College

Cynthia Crimmins, M.S.Ed.

York College of Pennsylvania

CORRESPONDENCE

Email reder@conncoll.edu

(1) To inform and improve their own work, educational developers have drawn upon the neuropsychology of learning (including a better understanding of growth mindset and the role of metacognition in learning), theories about the affective and sociological aspects of learning (including theories of motivation, approaches to mitigating stereotype threat, and removing unintentional roadblocks to learning), and evidence from national studies (such as the National Survey of Student Engagement and the Wabash National Study) about best teaching practices. Such evidence informs much of the programming that faculty centers for teaching and learning offer; and while many faculty find these evidence-informed practices persuasive, they often lack local institutional significance.