Service learning: model for community evaluation.

Abstract

One of the challenges of service learning at the university level is to create a collaboration that incorporates the skills and expertise of university students in a way that also serves the community. The psychology honors project described here was an attempt at bridging this divide by offering a program aimed at community evaluation. The collaboration is described, along with a new model of community evaluation and an after-school program skills inventory.

Introduction

One of the challenges of service learning at the university level is to create a collaboration that incorporates the skills and expertise of university students in a way that also serves the community. For undergraduate psychology students, the understanding of statistics and research design is an important learning objective. One possible application of this skill set is the evaluation of community programs. The psychology honors project described below was one attempt to bridge the divide often inherent in college-level service learning. As an additional outcome, a new model was created to bridge the gap between research design and community evaluation.

During the past three years, a number of psychology students at the University of Minnesota Duluth participated in an honors seminar on community evaluation. The service-learning projects provided students with experientially based applications of course material and helped a number of community agencies implement program evaluation efforts. The projects undertaken were coordinated in conjunction with a number of agencies affiliated with the Duluth Area Family YMCA, including Mentor Duluth, True North AmeriCorps, and the 21st Century Community Learning Programs.

To bridge the application needs of undergraduate psychology students in statistics and research design with the program evaluation needs of the various youth-serving agencies, the CEC-15 (Community Evaluation Components) model was developed. This model emerged from an understanding of the evaluation challenges that community organizations faced and of what undergraduate students needed in order to synthesize their program evaluation skills. In addition, a new measure (ACES) was developed to meet the difficult challenge of evaluating the outcomes of after-school programs and drop-in centers. The ACES was created to measure a variety of skills, including academics, computer skills, everyday skills, and social skills. This process of integrating what sometimes felt like orthogonal needs was a journey of service and learning for all involved.

Chaskin, George, Skyles and Guiltinan (2006) identified the importance of community access and capacity for assessment and planning information as well as the struggles inherent in community-research partnerships. They, along with Wilson (2004), identified issues of trust, mutual commitment and long-term relationships as critical components in such partnerships. The evolution of this ongoing service-learning collaboration seems to be successfully addressing such issues and mutual needs.

Literature

While more and more colleges and universities are making commitments to service learning (Berman, 1999; National Association of State Universities and Land Grant Colleges, 1995), few are requiring it (Antonio, Astin and Cress, 2000). The onus for integrating service into the academic curriculum rests on faculty. While faculty are incorporating service learning at an increasing rate (Astin and Sax, 1998; Ward, 2000), faculty resistance is identified as a serious obstacle to service learning. Faculty report that it takes too much time and effort, and they are not convinced of its educational value (Gray, Ondaatje, Fricker, Geschwind, Goldman, Kaganoff, Robyn, Sundt, Vogelgesang, and Klein, 1999). In addition, some faculty found service learning to be a deterrent to the traditional reward systems of the university (Berman, 1999).

More research on the academic outcomes of service learning may help to encourage faculty to get on board. However, there is a lack of awareness about what service learning actually is, how it can be incorporated into the curriculum, and the variety of benefits involved.

Service learning is the integration of service into the curriculum in such a way that the learning gained in either domain (service to the community or classroom curricular content) connects with and enhances the understanding of the other (Myers-Lipton, 1994). It is differentiated from community service or volunteerism in that the emphasis in service learning falls on the learner in addition to the agencies and individuals served. Service learning similarly goes beyond experiential learning in its aim of helping with social problems (Gray et al., 1999).

While the outcomes of "service learning" on the various constituents involved are well documented and compiled in a summary of research between 1993 and 2000 (Eyler, Giles, Stenson, and Gray, 2001), definitions of service learning in contrast with other forms of volunteerism are not consistent across studies. Student outcomes included in the summary were categorized as (1) personal outcomes, such as personal efficacy, identity, communication, and leadership; (2) social outcomes, such as the facilitation of cultural and racial understanding and increased citizenship and social responsibility; (3) learning outcomes, such as critical thinking or cognitive development; and (4) career development.

Astin, Vogelgesang, Ikeda, and Yee (2000) compared service learning with community service in terms of impacts on students. Their study included a national longitudinal sample of college students over four years. Pre/post comparisons showed significant positive effects of service participation on all 11 outcomes; this included service learning as well as other forms of community service. The outcomes were academic performance (GPA, writing skills, critical thinking skills), values (commitment to activism, promoting racial understanding), self-efficacy, leadership (leadership activities, self-rated leadership ability, interpersonal skills), choice of a service career, and plans to participate in service after college. In this study, service learning, as compared to general community service, added significantly to the benefits for all outcomes except interpersonal skills, self-efficacy, and leadership. Such added benefits were strongest for the academic outcomes, especially writing skills.

It is these academic outcomes that faculty often mention as suffering most when the curriculum is "watered down" with activities such as service to the community (Gray et al., 1999). Also noteworthy to faculty might be that service participation has the strongest effect on students' decision to enter a service-related field, regardless of the major they selected when they started college (Astin et al., 2000). Service learning is particularly useful when it is part of students' majors. The most important factor associated with a positive experience was the degree of students' interest in the subject. The second most important factor was whether the professor encouraged class discussion (Astin et al., 2000).

Reflection is an important pedagogical consideration. Astin et al. (2000) found evidence in both qualitative and quantitative data for the impact of reflection as a way to connect the experience to the course content. This occurred in the form of "processing" the experience with others, both classmates and the professor, and through journaling or assignments. Service learning allowed for this more than did community service.

Outcomes beyond those for students are represented less frequently in the literature, or by faculty, yet they are no less real. Gray et al. (1999) found that, in addition to students, faculty indicated a heightened sense of civic responsibility and personal effectiveness as an outcome of service learning. Similarly, universities reported improved relationships with the community as a result of service-learning programs (Gray et al., 1999; Clark, 2000). Clark (2000) found a favorable impact on the community as a result of university service learning on a number of levels, from facilitating community project goals to community access to new resources. The social outcomes identified in the review by Eyler et al. (2001) include such things as reducing stereotypes and facilitating cultural and racial understanding, increased social responsibility and citizenship skills, as well as the commitment to service. Such student outcome measures undoubtedly have direct impacts on the community.

Many of these outcomes were experienced in the service-learning initiative outlined below, as were others that were unexpected and not discussed in the literature. To facilitate the process of other faculty becoming involved with community service-learning collaborations, we have documented our own journey: the evolution, the outcomes, the benefits, the struggles, and the lessons learned.

Evolution of Collaboration

In 2001, a number of community partners came together to create an AmeriCorps organization in Northeast Minnesota. One challenge of developing such an organization was to create an evaluation for the program as a whole. A particular obstacle was not yet knowing which agencies would be involved or what evaluations they already had in place. To facilitate this, a faculty member from the psychology department at the University of Minnesota Duluth created a service-learning course in which students would help assess the various agencies' needs and develop them into a centralized program evaluation.

In the first year, the YMCA was given a planning grant to develop True North, an AmeriCorps organization serving children and families in Northeast Minnesota. During this first year, twenty psychology undergraduates signed up for a two-credit honors seminar. The seminar focused on the application of statistics and research design to program evaluation. Each student worked with one of the community partners in several ways: they volunteered in order to develop an understanding of the agency and of the needs of the children and families involved, they gathered information about the program evaluation efforts already in place at that agency, and they designed a further evaluation appropriate for their particular agency. Over time it became apparent that many organizations were struggling with the same problems in designing evaluations for their programs. The honors seminar provided a place in which mutual concerns could be discussed.

During the semester, each student designed an evaluation project. Part of the assignment was to consult with the agency concerned, conduct a literature search, and locate "model programs" and exemplary evaluations that had been conducted on similar programs across the country. Psychology students have the expertise necessary for a comprehensive literature search and the technology skills necessary for searching web-based information systems. The honors seminar provided an opportunity to share these resources. While some of the projects did not go beyond the planning or design stage, others were submitted to the University of Minnesota Institutional Review Board (IRB) for human subjects review. After the studies were approved, evaluation data were collected. At the end of the year, each student created a PowerPoint presentation, which was delivered at a campus/community event called the Psychology Student Showcase. The students also created notebooks including their final papers, the model evaluations that they used, and relevant articles collected. The notebooks were presented to the seventeen different agencies involved, and the presentations were videotaped with access provided through the department webpage.

In the summer of 2004, True North AmeriCorps was funded and forty AmeriCorps members began their year of service. Eight part-time positions were reserved for psychology students who would continue their work with the honors project while they served their sites as AmeriCorps members. These students not only needed to meet the qualifications and be accepted through the AmeriCorps screening process, but also were required to meet departmental prerequisites and GPA requirements for honors course enrollment.

The honors students each worked ten hours per week with a community agency as part of their AmeriCorps member service requirements. Through the psychology honors course, they also designed and conducted an evaluation for that agency. Students' time and duties were kept separate for each of these roles. AmeriCorps volunteer time afforded students "participant observer status," which we believed to be critical for this program evaluation effort. The actual program evaluation component of the project was reflected in the honors course requirements. The combination proved effective in making the course an integrated experience for students. While the focus of the course was program evaluation, the on-site volunteer component of the AmeriCorps program also provided lessons that went beyond program evaluation. In the second year, all of the evaluations were submitted to the IRB, the studies were conducted, and the results were presented at the Psychology Student Showcase. Several of these eight part-time AmeriCorps members went on to work in the agencies they served, and other projects were continued through the Undergraduate Research Opportunities Program (UROP). For all of the students, this project became a significant experience.

Outcomes

Two unanticipated outcomes emerged as a result of this collaboration. To make the transition between research design and program evaluation, a model (CEC-15) was designed to facilitate communication between students and community agencies [see Table 1 on the issue website, http://www.rapidintellect.com/AEQweb/spr2007.htm]. Meanwhile, a search of the literature revealed the difficulties of evaluating after-school programs in which children engage in a variety of activities. To meet the need for a wide-ranging evaluation with a number of subscales, the ACES was designed. While students' work created the first draft of the ACES, agency personnel quickly stepped up to modify, expand, and implement the questionnaire.

CEC-15

During the first year of the AmeriCorps planning grant, twenty students were assigned to seventeen agencies. One of their assignments was to discover what already existed in terms of program evaluation. In general they found anonymous satisfaction surveys, which were often unanalyzed. Agency staff would sometimes hand the students a stack of papers and ask for help in analyzing them, but the surveys generally had such a variety of questions that nothing could be analyzed beyond simple percentages. The difference between a satisfaction survey and an outcome study seemed to be a big step, one that would require the various agencies to build their capacity for program evaluation and their interest in conducting a comprehensive study. To conduct these studies, agencies would need parental consents as well as participant tracking, neither of which was standard practice in most of the community agencies involved. Some of the agencies kept no computer files and others did not have Internet capabilities, but both would be essential if a comprehensive evaluation was to be designed. However, the skills of university students proved valuable on more levels than had been anticipated: the students were invested in applying their newly developed skills, and agencies were willing to help students with their projects by providing any information they needed. This eliminated the barriers often created by university "academics" who fail to communicate with agency personnel about their real-life issues (Chaskin, George, Skyles & Guiltinan, 2006). At the same time, the students were struggling to find practical applications of research design and statistics. The challenges faced by community agencies such as drop-in centers serving youth expanded lessons learned in the traditional undergraduate psychology curriculum.

The CEC-15 was developed to bridge some of the challenges faced in communicating with both university students and agency staff. It provided a language so simple, and a design strategy so basic, that everyone could easily adopt it [see Table 1, CEC-15]. The essence of the CEC-15 was to move from anonymous data to an integrated database with information on each participant. Participants could no longer be anonymous, and data needed to be coded and entered into a computer. Informed Consent, Participant Tracking, and Demographics became baseline information for each agency conducting program evaluation. This created some painful dissonance between program philosophy and the principles of program evaluation. For example, drop-in centers rely on an open environment where youth come and go at will, and parents who are not highly involved in the lives of their children are not likely to return consent forms. In some cases, simple evaluation procedures required organizational change. Staff perceptions of program evaluation as counterproductive and lacking in utility needed to be heard and addressed. For many of the agencies, such efforts in the past had only resulted in time taken away from the children they served. All too often, grant-mandated data collection failed to result in any information that was useful for programmatic considerations. Rather, government and university personnel had made a uni-directional use of data without giving anything back to the agency. The avoidance of this "black hole" for community agencies was, above all else, one of the mandates for students involved in this course. Agency staff needed to be involved in the design, and they needed to see results that were actually helpful to the children and families served as well as to the program itself.
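
To make this shift concrete, the sketch below shows, in Python, what a minimal integrated participant record might look like. It is offered only as an illustration: the field names (participant_id, consent_on_file, visits) and values are our own hypothetical choices, not the agencies' actual data layout, which is not published here.

```python
# A minimal sketch (not the agencies' actual system) of moving from anonymous
# surveys to an integrated database: one record per participant, combining
# informed consent (CEC-1), demographics (CEC-2), and tracking (CEC-3).
# All field names and values here are hypothetical illustrations.
import csv

FIELDS = ["participant_id", "consent_on_file", "age", "gender", "visits"]

participants = [
    {"participant_id": "P001", "consent_on_file": True,  "age": 10, "gender": "F", "visits": 42},
    {"participant_id": "P002", "consent_on_file": True,  "age": 12, "gender": "M", "visits": 3},
    {"participant_id": "P003", "consent_on_file": False, "age": 11, "gender": "F", "visits": 17},
]

# Only participants with a signed consent form enter the evaluation database.
with open("evaluation_db.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(p for p in participants if p["consent_on_file"])
```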

This emphasis on follow-through in program evaluation was built into the CEC-15 model by the creation of the process components across the top [Table 1]. While design and data collection are part of the necessary steps of evaluation, the process is incomplete without the sharing and dissemination of the findings.

Also reflected in the CEC-15 model was the articulation, for students and agencies, of the multitude of possibilities in program evaluation beyond the notion of an experimental and a control group. For most of the youth-serving agencies involved, such evaluation designs raised an ethical issue: no one wanted to deny needy children important resources. Instead, the CEC-15 suggested comparisons between groups as they naturally occur, such as youth who attended frequently versus those who rarely attended. Because this variation arises naturally, it is the most important consideration for practical purposes.

As a decision-making model, the CEC-15 provided a number of options for comparison groups as well as options for what to measure. For example, one program might look at performance measures such as how well a student performs on an external measure such as school grades or attendance. Other programs might be more interested in qualitative measures such as stories or drawings. One program was already using live observations, keeping track of how much time kids spent on schoolwork versus games in the computer center. Unfortunately, this center lacked the ideas and resources to make use of these data, which had never been analyzed. The CEC-15 suggested comparisons such as children's age and gender as well as outcomes in terms of school grades. By combining these components of the CEC-15, the student created a "design" for the center to use in analyzing their data. One task for our students was to help each agency figure out which measures best fit its needs. The CEC-15 served as a reminder that at least two such measures would be needed in order to have an analysis.
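
The sketch below illustrates the kind of grouped comparison the CEC-15 suggested for such observation logs: average minutes spent on schoolwork versus games, broken out by a demographic variable. The entries are invented for the example; the center's actual records are not reproduced here.

```python
# A hypothetical sketch of analyzing computer-center observation logs.
# Each log entry: (gender, minutes on schoolwork, minutes on games).
from collections import defaultdict
from statistics import mean

logs = [
    ("F", 35, 25), ("F", 50, 10), ("M", 20, 40),
    ("M", 15, 45), ("F", 40, 20), ("M", 30, 30),
]

# Group the observations by gender (one of the comparisons suggested above).
by_gender = defaultdict(list)
for gender, school, games in logs:
    by_gender[gender].append((school, games))

# Report average time on schoolwork versus games for each group.
for gender, rows in sorted(by_gender.items()):
    print(f"{gender}: schoolwork {mean(r[0] for r in rows):.0f} min, "
          f"games {mean(r[1] for r in rows):.0f} min")
```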

The CEC-15 addressed the importance of having comparisons for program evaluation as well as the kinds of comparisons that can be made. These were the comparisons our students were accustomed to making in their experimental design courses, but the CEC-15 provided additional options beyond those typically used in research design. Examples include the evaluation of multiple respondents, such as how children and their parents compare in assessing the child's strengths and needs. Other comparisons related two different programs within an agency, such as structured versus unstructured after-school programming, or compared youth who used the services regularly with those who utilized the programs only on occasion. Pre-post designs compared beginning-of-the-year measures with end-of-the-year or longitudinal measures.
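
At its simplest, a pre-post comparison of this kind reduces to computing each child's change score between the two measurement points, as the sketch below shows. The scores are hypothetical placeholders, not data from any of the agencies.

```python
# A sketch of a simple pre-post comparison: the same children measured at the
# beginning and end of the year. All scores below are invented illustrations.
from statistics import mean, stdev

fall   = [2.8, 3.1, 2.5, 3.0, 2.7, 3.3]  # beginning-of-year scores
spring = [3.0, 3.4, 2.9, 3.1, 3.0, 3.5]  # end-of-year scores, same children in order

# Per-child change scores, then their mean and spread.
changes = [s - f for f, s in zip(fall, spring)]
print(f"mean change: {mean(changes):+.2f} (SD {stdev(changes):.2f}, n={len(changes)})")
```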

Examples of program evaluation designs using the CEC-15 elucidate the practical application of the model for agencies and students. The model used by the Duluth Boys and Girls Club to evaluate their Goals for Graduation program included information for CEC items 1, 2, 3, 5, and 13. Informed consent (CEC-1), demographics (CEC-2), and participant tracking (CEC-3) were obtained from a computer tracking system already in place. Two comparisons (CEC-5) were made. One compared children who were enrolled in Goals for Graduation with children who were not in this program. A second comparison related to how often children attended the tutoring center. The outcome for the resulting four groups was assessed by the external measure (CEC-13) of school grades. This approach took a program that the club was already conducting and created a design for measuring it.
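
The sketch below illustrates, with fabricated values, how the resulting four-group comparison might be tabulated: enrollment in Goals for Graduation crossed with tutoring attendance, with grades as the outcome. The club's actual data are not reproduced here.

```python
# A hypothetical sketch of the four-group design described above: enrollment
# (CEC-5) crossed with attendance frequency, school grades as outcome (CEC-13).
from collections import defaultdict
from statistics import mean

# Each record: (enrolled in Goals for Graduation, frequent attender, GPA).
records = [
    (True,  True,  3.4), (True,  False, 3.0), (False, True,  2.9),
    (False, False, 2.6), (True,  True,  3.6), (False, False, 2.8),
]

# Sort each child's GPA into one of the four crossed groups.
groups = defaultdict(list)
for enrolled, frequent, gpa in records:
    groups[(enrolled, frequent)].append(gpa)

# Report mean GPA per group.
for (enrolled, frequent), gpas in sorted(groups.items()):
    print(f"enrolled={enrolled!s:5} frequent={frequent!s:5} "
          f"mean GPA {mean(gpas):.2f} (n={len(gpas)})")
```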

An example of a different design involved the Duluth Youth Agency Coalition, composed of nine different youth-serving agencies. Agency directors wanted to create a questionnaire to evaluate children's experiences with different programs. This resulted in a two-part evaluation. Part one included informed consent (CEC-1) with a qualitative component (CEC-14) involving focus groups. From the focus groups a self-report inventory was created (CEC-10). The various agencies then gave the questionnaire to additional participants and used demographics such as age and gender in making comparisons between the various sites.

Throughout the two years of this service-learning course, students conducted variations of all of the CEC-15 program evaluation components. The needs of students and the community agencies led to the development of the CEC-15, and it proved to be an effective vehicle for communicating evaluation design options that were simple and feasible, yet would also result in meaningful feedback for the agencies.

ACES

As part of the program evaluation model taught in the service-learning course, students were required to find model programs and model evaluations to use with their agencies. In most cases they were able to adapt an evaluation from another agency elsewhere in the country. However, many students were unable to find evaluations of drop-in centers and after-school programs that fit the diverse needs of their agencies. To address this, the faculty member decided that a good activity for the honors seminar students would be to create an evaluation instrument for measuring after-school programs. This measure would need to be broad in scope because some agencies focused on tutoring and computer skills, while others focused on sports, outdoor activities, dance, music, or cultural events. The challenge was the need for a single instrument diverse enough to capture this variety of outcomes. In the spring of the first year, the students generated a long list of items for the ACES questionnaire. Since the ACES seemed so promising, it was included in the evaluation for the AmeriCorps grant as well as the 21st Century grant. The following summer, staff representing several agencies worked together to clarify the ACES subscales. They also added parent and staff forms of the ACES so that the information would not be based entirely on the child's self-report. The ACES I was thus developed, with its subscales of academics, computer skills, everyday skills, and social skills.

The ACES then expanded to child, staff, and parent forms of the questionnaire, with versions for children from kindergarten through high school. In addition, the ACES II was developed to assess attitudes, confidence, enthusiasm, and special skills. In the next year, AmeriCorps members began to administer the ACES, and honors students continued the work of evaluating the new instrument. The creation of a questionnaire is a long process, but current UROP students continue to work on the reliability and validity of the ACES. Although it is still under construction, we now have two years of pre-post data from a number of AmeriCorps sites, and the ACES has been included in subsequent grants. This is quite an evolution for a questionnaire originally designed as a student project.
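
As an illustration of the kind of reliability work involved, the sketch below computes Cronbach's alpha, a standard internal-consistency statistic, for a hypothetical subscale. The item scores are invented; the actual ACES analyses are not reproduced here.

```python
# A textbook Cronbach's alpha computation, sketching the sort of reliability
# check that might be run on an ACES subscale. All scores are hypothetical.
def cronbach_alpha(items):
    """items: one list of scores per questionnaire item, respondents in the same order."""
    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    k = len(items)                      # number of items
    n = len(items[0])                   # number of respondents
    totals = [sum(item[j] for item in items) for j in range(n)]  # per-respondent totals
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    return (k / (k - 1)) * (1 - sum(var(item) for item in items) / var(totals))

# Four hypothetical items from one subscale, rated 1-5 by five respondents.
subscale = [
    [3, 4, 2, 5, 4],
    [3, 5, 2, 4, 4],
    [2, 4, 3, 5, 3],
    [3, 4, 2, 5, 5],
]
print(f"Cronbach's alpha: {cronbach_alpha(subscale):.2f}")
```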

Conclusions

The True North AmeriCorps program has now expanded and this fall placed 73 members at 54 sites in Northeastern Minnesota. UMD faculty and students are still involved through internships and UROP projects. The ACES continues to emerge as an evaluation for after-school programs.

Thomas (2005) discusses the need for a balance between learning and service in service-learning. Her argument is that the learning often takes precedence over the service. This course, because of its connection with the AmeriCorps program, placed more emphasis on service than does the typical service-learning course. Astin, Vogelgesang, Ikeda, and Yee (2000) stressed the importance of students not perceiving the service experience as an "added on" requirement for the course. Because the students were an essential part of the community of AmeriCorps members, their commitment to the service went beyond that of a class expectation. Rather, their involvement in the course was enhanced by their positions as AmeriCorps members. The service came first.

Note

Paula Pedersen, Sandy Woolum and Blair Gagne collaborated on the creation and implementation of this service learning initiative.

References

Antonio, A.L., Astin, H.S., & Cress, C.M. (2000). Community service in higher education: A look at the nation's faculty. The Review of Higher Education, 23(4), 373-397.

Astin, A. W. & Sax, L.J. (1998). How undergraduates are affected by service participation. Journal of College Student Development, 39(3), 251-264.

Astin, A.W., Vogelgesang, L.J., Ikeda, E.K., & Yee, J.A. (2000). How service learning affects students. Higher Education Research Institute, University of California, Los Angeles.

Berman, G.L. (1999). Antecedents and strategies for the successful implementation of service learning programs in higher education. Unpublished dissertation, University of Massachusetts, Boston.

Chaskin, R.J., George, R.M., Skyles, A., & Guiltinan, S. (2006). Measuring social capital: An exploration in community-research partnership. Journal of Community Psychology, 34.

Clark, M.M. (2000). Evaluating the community impact of service initiatives: The 3-I Model. Unpublished dissertation, Peabody College, Vanderbilt University.

Eyler, J.S., Giles, D.E., Stenson, C.M., & Gray, C.J. (2001). At a glance: What we know about the effects of service-learning on college students, faculty, institutions and communities, 1993-2000: 3rd Edition. Corporation for National Service: Learn and Serve America National Service Learning Clearinghouse.

Gray, M.J., Ondaatje, E.H., Fricker, R., Geschwind, S., Goldman, C.A., Kaganoff, T., Robyn, A., Sundt, M., Vogelgesang, L., & Klein, S.P. (1999). Coupling service and learning in higher education: The Final report of the evaluation of the Learn and Serve America, Higher Education Program. The RAND Corporation.

Myers-Lipton, S.J. (1994). The effects of service learning on college students' attitudes toward civic responsibility, international understanding, and racial prejudice. Unpublished dissertation, University of Colorado.

National Association of State Universities and Land Grant Colleges (1995). Urban Community Service at AASCU and NASULGC Institutions: A Report on Conditions and Activities. Washington, DC.

Thomas, J. (2005). Keeping the "Learning" in Service-Learning. Academic Exchange Quarterly, 9(1).

Ward, S. (2000). Transforming the instructor: Service-learning integrated into a community college curriculum. Paper presented at the annual meeting of American Educational Research Association, New Orleans, LA.

Wilson, D. (2004). Key features of successful university-community partnerships. In New directions for civic engagement: University Avenue meets main street (pp. 17-23). Charlottesville, VA: Pew Partnership for Civic Change.

Paula J. Pedersen, University of Minnesota-Duluth

Sandy Woolum, University of Minnesota-Duluth

Blair Gagne, Duluth Area Family YMCA

Pedersen, Ed.D., is Assistant Professor of Psychology. Woolum, Ph.D., is Associate Professor of Psychology. Gagne, M.S., is Executive Director of Mentor Duluth and True North AmeriCorps.
