
Tackling campus-based assessment collaboratively.

Abstract

The assessment of multi-disciplinary skills requires a campus-wide effort to succeed. In response to the State University of New York's mandate that information management skills be assessed system-wide, SUNY Fredonia created an information literacy assessment tool with both pre-test and post-test components. This article explores the collaborative process used to implement the assessment tool and the resulting data, emphasizing the positives and negatives associated with campus-based assessment.

Introduction

The twenty-first century brought with it a renewed interest in the assessment of general education programs at colleges and universities. The K-12 standards movement has propelled this recent trend and, consequently, both regional and state accrediting bodies are formalizing accountability systems in higher education (Wellman 47). One example of this movement can be seen in the State University of New York (SUNY) system, where new general education assessment standards arose from both its regional accreditation body, Middle States, and from the SUNY Provost's office.

Both systems of assessment have many similarities. One is the autonomy given to campuses: neither plan dictates how the outcomes are obtained; rather, assessment is handled at the campus level. "In other words, it is the role of the [institution] ... to evolve its own solutions to issues of implementation by drawing on the creative energies of its faculty, staff, and other constituents" (Ratteray 369). Additionally, both revised plans added a new component to their assessment agendas: information literacy. General education programs are "designed to cultivate knowledge, skills and attitudes that all of us use and live by during most of our lives" (Stone 199). It is not surprising, therefore, that information literacy has been added to general education assessment plans. Fueled by the development of the Association of College and Research Libraries (ACRL) Information Literacy Competency Standards for Higher Education in 2000, colleges and universities are developing assessment measures that will monitor the ability of today's students to use information effectively to solve problems in a technology-rich society (Brasley 6).

Assessing information literacy is more challenging than assessing disciplines such as mathematics, because it is a skill that is multi-disciplinary by nature. Some institutions have formally implemented information literacy into their curriculum through credit-bearing courses, making it easy to draw on tests and assignments to measure student outcomes. But what happens when an institution has to measure a skill that appears everywhere throughout the curriculum, but is measured formally nowhere? This was the challenge encountered by SUNY Fredonia when faced with the assessment of information literacy mandated by the SUNY Provost's office. Developing, implementing and surviving the logistics of assessment is difficult. It was only through campus collaboration that successful outcomes were achieved for information literacy.

Context

In 1999, the State University of New York created the Provost's Advisory Task Force on the Assessment of Student Learning Outcomes, which outlined a University-wide assessment plan for general education. The challenge was to create a plan that reflected and acknowledged the diversity among SUNY's 64 campuses (Francis 338). The plan identified ten discipline areas and two competency areas, including information management (which encompasses both information literacy and computer competencies), that would be assessed system-wide. Outcomes that each campus must report on were assigned to each area. According to the Guidelines, each campus must have its assessment methodologies approved by a SUNY assessment committee, and results are reported on a three-year cycle. The Guidelines call for an elusive 20% sample of a given program to be assessed, indicating how many students fall into the following groupings: 80% and above, Exceeding Standards; 75-79%, Meeting Standards; 70-74%, Approaching Standards; below 70%, Not Meeting Standards. Aside from these few requirements, campuses are autonomous in creating assessment measures that reflect the missions and goals of their individual institutions; assessment structure and content would be created at the campus level (Guidelines I).
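
For illustration only, the reporting scheme described above amounts to banding each student's score and tallying the bands. The short Python sketch below assumes the band boundaries named in the text (80% and above, 75-79%, 70-74%, below 70%); the function and variable names are hypothetical and are not part of the SUNY Guidelines.

# Illustrative sketch only: tally test scores into the four SUNY reporting
# bands described above. Band boundaries follow the groupings in the text;
# function and variable names are hypothetical.

def band(percent_score):
    """Map a percent score to its reporting category."""
    if percent_score >= 80:
        return "Exceeding Standards"
    elif percent_score >= 75:
        return "Meeting Standards"
    elif percent_score >= 70:
        return "Approaching Standards"
    return "Not Meeting Standards"

def tally(scores):
    """Count how many students fall into each reporting band."""
    counts = {"Exceeding Standards": 0, "Meeting Standards": 0,
              "Approaching Standards": 0, "Not Meeting Standards": 0}
    for score in scores:
        counts[band(score)] += 1
    return counts

# Example with made-up scores:
print(tally([82, 76, 73, 64, 91]))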

Achieving "Buy-In" for Campus-Based Success

While campus-based assessment plans offer colleges freedom, they are also often seen as extra work by campus faculty. Creating effective assessment requires considerable additional time and energy. In his editorial, "Outcomes Assessment is Here to Stay, Get Faculty Buy In," Daniel Weinstein suggests: "To prevent faculty from feeling overwhelmed [by assessment], utilize the good work and tools that faculty already have" (2). While this suggestion makes sense for discipline areas, how is assessment for a multi-disciplinary skill such as information management "bought into" by teaching faculty?

SUNY Fredonia Turns to Daniel A. Reed Library

SUNY Fredonia achieved the exact opposite of "buy-in" when beginning its information management efforts. The faculty who formed the first information management committee quit after only a few months of service. According to Oswald Ratteray, campuses may lack enthusiasm for anything related to information literacy because of its origins in the library support service of bibliographic instruction. In essence, faculty have "marginalized information literacy as a librarian's 'turf' ... [and therefore,] feel absolved of any need to take a great interest in, or responsibility for, information literacy in the curriculum" (Ratteray 369). The campus did end up turning to the library for assistance, making an instruction librarian the chair of the information management committee. The Daniel A. Reed Library's instruction department was under new leadership at this time. Faculty/librarian collaboration was in its infancy, with few instances to draw upon for assessment purposes, and there was no credit-bearing information literacy course. Even with the library involved, assessment data would need to be created from scratch. The library, however, was developing its new curriculum around the ACRL Information Literacy Competency Standards for Higher Education, which were the inspiration for the information management competencies. It was here that the library could offer educated leadership. Along with three additional faculty members, the librarian-led information management committee began its assessment journey.

Methodology

After much research and deliberation, the committee designed a pre/post assessment plan that would show growth over time. The population group would be incoming freshmen attending summer orientation. Because these students had no exposure to collegiate-level research, assessing them would give faculty a snapshot of the information literacy skills students have when entering college. Twenty percent of this population would be tested again in two years using the same assessment tool. The objective was to assess whether these skills changed after exposure to research and library instruction.

SUNY Fredonia's information management assessment tool is a thirty-two-question multiple-choice test. The first twelve questions do not require the use of a computer; rather, they assess basic research techniques. The remaining twenty questions require the use of a computer and draw on several specific online databases and Internet websites. Each question correlates to one of the three system-wide outcomes for the information management competency. The standard learning outcomes for information management must show that students can:

1. Perform the basic operations of personal computer use

2. Understand and use basic research techniques

3. Locate, evaluate and synthesize information from a variety of sources.

The questions that address the first outcome, performing the basic operations of personal computer use, focus on students' ability to perform basic tasks such as emailing an article, identifying domain names, and differentiating between document types (HTML vs. PDF). The second outcome, understanding and using basic research techniques, was fulfilled through questions focusing on Boolean phrasing, identifying citation styles and content, and navigating the SUNY Fredonia online catalog. The final learning outcome, which requires that students locate, evaluate and synthesize information, was fulfilled through questions in which students conducted searches in the Wilson OmniFile database and interpreted the results. Additional questions asked students to evaluate two web sites and compare and contrast their content.
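
As a rough sketch of how such a question-to-outcome correlation could be scored, the Python fragment below maps question numbers to the three outcomes and computes a percent-correct figure for each outcome. The actual assignment of Fredonia's thirty-two questions to outcomes is not reproduced in this article, so the mapping, names, and data layout below are placeholders.

# Illustrative sketch only: score a completed test by outcome, given a
# question-to-outcome map. The placeholder mapping below is NOT the real
# Fredonia assignment of questions to outcomes.

OUTCOMES = {
    1: "Perform basic operations of personal computer use",
    2: "Understand and use basic research techniques",
    3: "Locate, evaluate and synthesize information",
}

# question number -> outcome number (placeholder assignments)
QUESTION_MAP = {q: 2 for q in range(1, 13)}          # first twelve: research techniques
QUESTION_MAP.update({q: 1 for q in range(13, 20)})   # some computer-use items
QUESTION_MAP.update({q: 3 for q in range(20, 33)})   # searching and evaluating sources

def score_by_outcome(responses, answer_key):
    """Return percent correct per outcome.

    responses and answer_key each map question number -> chosen/correct option.
    """
    correct = {1: 0, 2: 0, 3: 0}
    total = {1: 0, 2: 0, 3: 0}
    for question, outcome in QUESTION_MAP.items():
        total[outcome] += 1
        if responses.get(question) == answer_key[question]:
            correct[outcome] += 1
    return {OUTCOMES[o]: 100 * correct[o] / total[o] for o in OUTCOMES}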

Organized Chaos: Pre-test Collaboration

Some common challenges in campus-based assessment include "... identifying, recruiting and testing a representative sample" (Klassen 43). While designing the assessment model, the Information Management committee was advised to gather as much data from the pre-test as possible, because relocating 20% of these students two years later for the post-test would be challenging. Because SUNY Fredonia's summer orientation program draws up to 1,000 students (around 250 in each of four sessions), getting enough pre-test participation would not be a problem. However, the age of this population group posed the first problem. While most students are eighteen upon entering college, some are still seventeen, and the seventeen-year-olds would need parental permission in order to participate. The second problem resulted from the sheer number of students being pre-tested. Given that potentially two hundred fifty students at a time would need computers, labs or classrooms, and proctors to monitor the assessment, the logistics were often overwhelming.

Collaborative efforts among campus departments helped defuse most problems during the implementation of the pre-test. The Student Affairs office provided the data necessary to "withdraw" students under eighteen from participating. Librarians, graduate students, and faculty volunteered as proctors. The Information Technology department offered several labs where students could take the assessment test simultaneously, as well as personnel who could assist with computer problems. The Disability Services office assisted a visually impaired student with the special software needed to view the Internet. As with all large endeavors, nothing ran perfectly: proctors forgot to come to their sessions, computers and air conditioners broke down during testing, web sites failed, and databases became overloaded with users and "froze." Remarkably, five hundred valid assessment tests were gathered during the summer of 2002. Preliminary data were sent to SUNY central for system-wide assessment, with the assurance of post-test numbers coming in a few years.

Complete Chaos: The Post-test That Almost Wasn't

In spring 2004, the information management assessment committee began recruiting for the post-test by providing incentives to volunteers. Once again, there was no single class in which the students needed for the post-test were enrolled. Instead, a virtual class was created: all five hundred students who had taken the pre-test were enrolled in a course called Information Management Assessment Test (IMAT) through the course management system Blackboard. The committee was able to correspond with students through email and arrange sign-ups for times and locations electronically. Students were reminded of their involvement in assessment during orientation and were asked to voluntarily complete the post-test, with the added incentive of a $10 bookstore gift certificate for participating. Students had twelve opportunities to take the test. Web, radio and class announcements regarding the assessment were sent to students and faculty. In spite of these time-consuming efforts, only forty-four students took the post-test, well short of the one hundred students needed.

Why were students unwilling to participate in the post-test? In the article "Low Examinee Effort in Low-Stakes Assessment: Problems and Potential Solutions," Wise and DeMars surmise: "Although [assessment tests] provide useful data for institutions, [they] carry little or no meaning for the students themselves" (1). This is especially true when the assessment test is not tied to a particular professor, class or discipline, where the student might feel obligated to participate. A ten-dollar gift certificate was not a large enough incentive for student commitment.

After much deliberation, another strategy was employed. The committee would attempt to find a pattern among the remaining students who had taken the pre-test. Were there any common courses in which these students were enrolled? If so, perhaps the faculty members would encourage their students to take the post-test. During the summer of 2004, the committee began looking for the ultimate needle in the haystack. The course scheduling software used at Fredonia at the time was unable to sort students by common classes. As a result, over four hundred student course schedules (on paper) were reviewed to find classes in which students who took the pre-test were commonly enrolled. A significant number of pre-test students were enrolled in four classes. The faculty were notified and agreed to assist. Through this effort an additional twenty-three students who had originally taken the pre-test were added to the post-test numbers. The grand total of students who took the post-test, over two semesters of recruiting, was sixty-seven, still short of the one hundred students needed. In essence, the post-test, and the original design of the assessment, failed.
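
The grouping the committee carried out by hand, finding the courses in which the largest numbers of remaining pre-test students were enrolled, is essentially a counting exercise. The sketch below shows one way it could have been automated had the scheduling data been exportable; the data layout and names are hypothetical.

# Hedged sketch of the grouping done by hand: given each remaining pre-test
# student's course schedule, find the courses in which the most of those
# students are enrolled. Data structures and course codes are made up.

from collections import Counter

def courses_with_most_pretest_students(schedules, top_n=4):
    """schedules maps student id -> list of course codes."""
    enrollment = Counter()
    for courses in schedules.values():
        enrollment.update(set(courses))   # count each student once per course
    return enrollment.most_common(top_n)

# Example with made-up schedules:
sample = {
    "s001": ["ENGL 100", "HIST 115"],
    "s002": ["ENGL 100", "BIOL 110"],
    "s003": ["ENGL 100", "HIST 115"],
}
print(courses_with_most_pretest_students(sample, top_n=2))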

The Silver Lining: "Buy-In" Occurred

When the aforementioned faculty members were contacted, something ironic occurred: they were interested in the results. One English faculty member decided not only to have the identified students take the post-test, but also wanted his entire class of over thirty to participate; he even gave up class time for the students to take the assessment test. In another instance, a faculty member offered each student in her course five extra-credit points on the final exam if they voluntarily took the information management assessment test.

Since the information management assessment efforts began in 2001, a transformation had begun on campus. The Library Instruction program had tripled the number of classes taught each semester. Opportunities for collaboration between librarians and faculty to infuse information literacy into the curriculum were also on the rise; in essence, entire courses were now equipped with assignments that could easily be used for assessment in the future. Thanks to the tremendous involvement on campus during the information management assessment efforts, in addition to the strong information literacy program developing under the leadership of the Library Instruction faculty, this general education competency was getting the "buy-in" that was needed four years earlier.

The Results

All was not lost on the information management pre/post test. The final number of students that completed the post-test was one hundred nineteen. Although not all of these students took the pre-test, there was still the opportunity to compare incoming students' information management skills with those of students who were halfway through their academic careers at SUNY Fredonia. See http://rapidintellect.com/AEQweb/fal2006.htm.

The results of the campus-wide information management post-test show an increase in the number of students who scored between 75-100% (Exceeding and Meeting Standards), no change in those who scored between 70-74% (Approaching Standards), and a decrease in the number of students who scored below 70% (Not Meeting Standards) for all three categories. In other words, student information management skills improved over the two years of the assessment. Beyond showing this general statistical improvement, the data also pinpoint which specific questions students improved on, or are still having difficulty with. For example, students showed little significant change in their ability to decipher article citations, an information literacy skill that warrants attention from librarians and teaching faculty.
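
The question-level analysis mentioned above can be expressed as a simple comparison of percent correct between the pre-test and post-test groups. The sketch below assumes a hypothetical data layout (each student's responses stored as a dictionary keyed by question number) and does not use the actual Fredonia results, which are reported at the URL above.

# Sketch with a hypothetical data layout: compare per-question percent
# correct between pre-test and post-test groups to see which skills
# improved and which still need attention. No real Fredonia data is used.

def percent_correct_by_question(responses, answer_key):
    """responses: list of dicts, each mapping question number -> answer."""
    results = {}
    for question, correct in answer_key.items():
        right = sum(1 for r in responses if r.get(question) == correct)
        results[question] = 100 * right / len(responses)
    return results

def improvement(pre_responses, post_responses, answer_key):
    """Return the change in percent correct for each question."""
    pre = percent_correct_by_question(pre_responses, answer_key)
    post = percent_correct_by_question(post_responses, answer_key)
    return {q: post[q] - pre[q] for q in answer_key}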

Conclusion

The journey described above is shared to offer insight into both the positives and negatives that resulted from the campus-based assessment approach SUNY Fredonia took to measure information literacy. The failure of the assessment design was the most disappointing outcome: the low turnout of post-test participants made the original plan of assessing only those who participated in the pre-test impossible. A review of the literature reveals that this is not the only instance of assessment design failure. As discussed in the article "Doing the Best You Can with What You Have: Lessons Learned from Outcomes Assessment," even if the best-laid assessment plans do not achieve the results intended, "some hard evaluation data, even if the data is less than perfect, are better than no data at all ..." (Carter 36).

The biggest achievement of this journey is that "buy-in" for information literacy is growing on campus. The catalyst for this paradigm shift was the information management assessment test, which not only provided value-added assessment data but also served as a public relations tool. Coupled with a strong library instruction department that thrives on faculty/librarian collaboration, this skill is beginning to receive the attention it deserves on Fredonia's campus.

In conclusion, observations from this project, coupled with a review of the assessment literature, support the idea that assessment is most effective when linked with a specific department. In the case of information literacy, assessment will be more effective when established faculty/librarian collaborative assignments can be used to measure student outcomes, including the actual synthesizing of information. Although the multiple-choice test effectively offered a benchmark for skills, it did not necessarily show how students solve problems using the information gathered (Rockman 193). Given the popularity of higher education assessment, the time to try out this new approach will come soon enough.

Works Cited

Brasley, Stephanie Sterling. "Building and Using a Tool to Assess Info and Tech Literacy." Computers in Libraries 26.5 (2006): 6-48.

Carter, Elizabeth W. "Doing the Best You Can with What You Have: Lessons Learned from Outcomes Assessment." The Journal of Academic Librarianship 28.1 (2002): 36-41.

Francis, Patricia L. and Donald A. Steven. "The SUNY Assessment Initiative: Initial Campus and System Perspectives." Assessment & Evaluation in Higher Education 28.3 (2003): 333-349.

"Guidelines for the Implementation of Campus-based Assessment in the State University of New York." 1999. Office of the Provost and Vice Chancellor for Academic Affairs. 29 May 2006. <http://www.suny.edu/provost/Implem_Guidelines.pdf>.

Hernon, Peter. "The Practice of Outcomes Assessment." Editorial. The Journal of Academic Librarianship 28.1 (2002): 1-2.

Klassen, Peter T. and Russell Watson. "Getting Real: Implementing General Education that Works." Academic Exchange Quarterly 5.1 (2001): 43-50.

Ratteray, Oswald M. T. "Information Literacy in Self-Study and Accreditation." The Journal of Academic Librarianship 28.6 (2002): 368-375.

Rockman, Ilene F. "Strengthening Connections Between Information Literacy, General Education, and Assessment Efforts." Library Trends 51.2 (2002): 185-198.

Stone, John and Steve Friedman. "A Case Study in the Integration of Assessment and General Education: Lessons Learned from a Complex Process." Assessment & Evaluation in Higher Education 27.2 (2002): 199-211.

Weinstein, Daniel. "Outcomes Assessment is Here to Stay, Get Faculty Buy In." Editorial. Academic Leader 22.1 (2006): 1-2.

Wellman, Jane V. "Assessing State Accountability Systems." Change Mar.-Apr. 2001: 47-52.

Wise, Steven L. and Christine E. DeMars. "Low Examinee Effort in Low-Stakes Assessment: Problems and Potential Solutions." Educational Assessment 10.1 (2005): 1-17.

Kerrie Fergen Wilkes, State University of New York at Fredonia

Kerrie Fergen Wilkes, M.L.S., is a Reference and Instruction Librarian and recipient of the SUNY Chancellor's Award for Excellence in Librarianship.
