Chapter V: bridge program evaluation and continuous improvement.
Evaluating a bridge program is an ongoing process: staying abreast of the requirements for advancement to the next levels of education and employment, tracking how readily participants are in fact advancing, and using this information to modify the program in ways that improve participants' outcomes.
The following steps will help evaluate and continuously improve the bridge program. The first set of steps involves collecting data needed to gauge the performance of students in a bridge program and track their outcomes when they leave. The second set, under "formative evaluation," outlines steps for evaluating how well the program is being implemented and how to improve program operations. The steps under "summative evaluation" describe how to tell whether the program is effective. The final, and in many ways the most important, steps involve using the findings from formative and summative evaluations to improve program operation and outcomes over time.
Data Collection for Tracking Student Program Performance and Outcomes
* Upon entry, secure each participant's informed consent to take part in an evaluation of the program. Figure 15, p. 85, presents a sample informed consent form. **
* Collect data on the demographic characteristics of every participant upon enrollment in the program. The data elements listed in table 5, p. 86, are the minimum set needed to conduct a summative evaluation of the program. Programs may want to collect additional information on participants, depending on funding requirements and objectives.
* Collect data on the performance and immediate outcomes of every participant. Table 6, p. 87, lists a recommended set of data elements.
* Interview participants, instructors, and staff around the midpoint and at the end of each initial program cycle to identify what is working with the program, what is not, and how it can be improved. Every effort should be made to interview participants who drop out to find out why they are leaving and what can be done to prevent attrition in the future.
* Regularly interview participants who have completed the bridge program, employers that have hired them, and faculty and support staff at the next level of education about the strengths and weaknesses of the program and ways it can be improved.
* Examine the performance of the program over time benchmarked against its own historical performance (or that of similar pre-existing programs) in terms of participant retention and completion rates, tested basic skills gains, job-placement rates and wage levels, and rates of advancement to higher levels of education and training.
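As one sketch of how such benchmarking might be computed, the following Python function derives completion, job-placement, and advancement rates from per-cycle counts. The record fields and the idea of a per-cycle summary are illustrative assumptions, not something prescribed by the guide.

```python
from dataclasses import dataclass

# Hypothetical per-cycle summary record; field names are illustrative.
@dataclass
class CycleStats:
    cycle: str                   # e.g. "Fall 2004"
    enrolled: int                # participants who started the cycle
    completed: int               # participants who finished
    placed_in_jobs: int          # completers placed in jobs
    advanced_to_next_level: int  # completers who advanced to further education

def benchmark(cycles):
    """Return completion, placement, and advancement rates for each cycle,
    so a program can be compared against its own historical performance."""
    rows = []
    for c in cycles:
        rows.append({
            "cycle": c.cycle,
            "completion_rate": c.completed / c.enrolled if c.enrolled else 0.0,
            "placement_rate": c.placed_in_jobs / c.completed if c.completed else 0.0,
            "advancement_rate": c.advanced_to_next_level / c.completed if c.completed else 0.0,
        })
    return rows
```

Laid side by side across cycles, these rates make trends (and the effect of program changes) visible at a glance.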
* Document the main components of the program being evaluated, including program content, duration, and support services. It is important to accurately describe the eligibility criteria for program entry and the process by which participants are recruited and selected. Describe any major changes in program design made during the course of each cycle. Interview program faculty and staff to better understand the reasons for these changes. (This will make it possible to more accurately compare and contrast outcomes across programs and over time.)
* Analyze the full costs of the program to all partners, including costs for staff, materials and supplies, equipment, and administrative overhead. ("Bridge Program Costs and Funding," pp. 66-74, discusses costs in detail.)
* Track the further-education outcomes of program leavers (completers and non-completers) from each program level using data from partner institutions, the state, or, if data are not available from those sources, the National Student Clearinghouse, which collects data on enrollment in undergraduate programs across the country.
* Collect data on employment and earnings of program leavers (completers and non-completers) for at least eight quarters prior to and eight quarters following program participation using unemployment insurance wage records. ***
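A minimal sketch of the pre/post earnings comparison this step implies: given a participant's quarterly UI wage records keyed by a sequential quarter index, average the eight quarters before program entry and the eight quarters after program exit. The function and parameter names are assumptions for illustration.

```python
def pre_post_earnings(quarterly_wages, entry_q, exit_q):
    """Average quarterly UI earnings for the eight quarters before program
    entry and the eight quarters after program exit.

    quarterly_wages: dict mapping a sequential quarter index to earnings;
    quarters with no covered employment are treated as zero earnings.
    """
    pre = [quarterly_wages.get(q, 0) for q in range(entry_q - 8, entry_q)]
    post = [quarterly_wages.get(q, 0) for q in range(exit_q + 1, exit_q + 9)]
    return sum(pre) / 8, sum(post) / 8
```

Treating missing quarters as zero rather than dropping them is a deliberate choice here: quarters out of covered employment are part of the earnings story, not missing data.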
* Secure data on an appropriate comparison group with which to compare labor-market outcomes of program participants. Use a quasi-experimental design to compare the labor market (and, if possible, further education) outcomes of pairs of program participants and comparison group members matched statistically based on demographic characteristics (age, gender, race, education, and prior earnings). †
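A real quasi-experimental design would normally rely on propensity-score or other statistical matching software. Purely as an illustration of the matching idea, the sketch below greedily pairs each participant with the closest still-available comparison-group member under a weighted distance over numeric characteristics; the names, features, and weights are all assumptions.

```python
def match_comparison_group(participants, pool, weights):
    """Greedy one-to-one nearest-neighbor matching on weighted distance.

    participants, pool: lists of dicts with numeric features
    (e.g. age, years of education, prior quarterly earnings).
    weights: dict mapping feature name to its weight in the distance.
    Returns a list of (participant, matched_comparison_member) pairs.
    """
    def distance(a, b):
        return sum(w * abs(a[k] - b[k]) for k, w in weights.items())

    matches = []
    available = list(pool)
    for p in participants:
        best = min(available, key=lambda c: distance(p, c))
        matches.append((p, best))
        available.remove(best)  # one-to-one: each member matched once
    return matches
```

Because each comparison member is used at most once, the order of participants can affect the pairing; production matching methods handle this (and categorical variables such as gender and race) more carefully.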
* Bring together program staff, instructors, and partners to study the data from the formative and summative evaluation activities, discuss what is working and what is not, diagnose the causes of the barriers that prevent participants from succeeding and progressing to the next level, and decide on ways to modify the program to promote participant advancement.
* Track the effectiveness of program modifications in improving outcomes.
* Shift resources (people, money, facilities) to support program strategies that prove effective in supporting participant advancement.
* Start a network of other bridge programs to share lessons learned, best practices, resources and tools, approaches to team development, and so on.
Figure 5: Sample Informed Consent Form
PERMISSION TO PARTICIPATE IN AN EVALUATION OF THIS TRAINING PROGRAM
The training program you are in is provided by [NAME OF PROVIDER] with funding from [NAME OF FUNDER(S)]. Both [SHORT NAME OF PROVIDER] and the funding agency(ies) would like to see how effective this program is in helping you and other participants get better jobs and pursue further education and training.
We would like your permission to access information on your employment, wages, and participation in education and training for use in evaluating this training program. This information is available from state agencies such as the [UNEMPLOYMENT DEPARTMENT AND HIGHER EDUCATION BOARD]. None of the information we collect about you from these agencies will be shared with anyone else, and no one will ever be able to connect this information with you personally.
You do not have to participate in this evaluation. If you choose not to participate, [NAME OF PROVIDER] cannot use information on you to evaluate the program.
Please read the following statement and then sign and date below if you agree to give [NAME OF PROVIDER] access to information needed to evaluate the program.
I hereby give [NAME OF PROVIDER] permission to use the information I have provided and information collected by state agencies on my employment and further education once I leave this training program. I understand that this information will be used to improve the quality of this program for future students, and that I will not benefit directly. All information about me and my job and education outcomes will be kept strictly confidential and will be used for evaluation purposes only. I understand that I do not have to give this information if I do not want to.
SIGNATURE                                             DATE

PRINT FULL NAME
** "Informed consent" is a process in which the risks, benefits, and requirements of a research study are explained to persons invited to take part. Before entering the study (in this case, a program evaluation), a participant should sign an informed consent form, which should describe the goals and methods of the study, the risks involved, and the steps that will be taken to protect participants' confidentiality. Participants can elect whether or not to take part in the study. If they opt out, information about them cannot be used.
*** By law, all employers with employees eligible for unemployment insurance (UI) must report to the state each employee's quarterly earnings in that state. States use this information to calculate UI benefits. In most states, the employment security agency is responsible for maintaining UI wage data, and providers of education and training (or the state agencies that fund them) can gain access to it for purposes of evaluating and improving programs if they have the necessary consents from participants and sign a data-sharing agreement with the state agency that collects the data.
† Hollenbeck and Huang (2003) used this method and data on Employment Service participants as a comparison group to assess the net impact of several workforce training programs in the Washington State workforce system.
Toni Henle, Women Employed Institute
Davis Jenkins, University of Illinois at Chicago Great Cities Institute
Whitney Smith, Chicago Jobs Council
Table 5: Data Elements to Collect on Bridge Participants Upon Program Enrollment

Personal Identifier
* Social security number

Age
* Date of birth

Gender
* Male or female

Race/Ethnicity
* White, African-American, Hispanic/Latino, Asian, or other

Education
* Earned high school diploma? (y/n)
* Earned GED? (y/n)
* Previously enrolled in job-training program? (y/n)
  --If yes, name of provider, program name, and date enrolled
* Previously enrolled in at least one college-level class? (y/n)
  --If yes, college name, program (or course) name, and date enrolled

Recent Work History
* Currently employed? (y/n)
  --If yes, name of current employer, job title, and one-sentence job description
  --If yes, hourly wage
  --If yes, hours per week currently working
  --Receive health benefits from employer? (y/n)
* Number of months employed in the past 12 months

Native Language
* Native language is English? (y/n)

Disability
* Disability that would require special support during the program? (y/n)
  --If yes, nature of disability

Education and Career Goals
* Main reason for enrolling in program
* Main goal for employment in the next 12 months
* Main goal for further education (beyond this program) in the next 12 months

Tested Basic Skills at Entry
* Tested reading and math levels

Table 6: Data Elements to Collect on Bridge Participant Performance and Initial Outcomes

Personal Identifier
* Social security number

Start Date
* Date participant started in program

Retention
* Participant successfully completed the program? (y/n)
  --If no, reason for leaving the program

Tested Basic Skills at Completion
* Tested reading and math levels, using the same instrument used upon enrollment

Job Placement
* Employed for at least 30 days within 12 months of program completion? (y/n)
  --If yes, new job held during the program? (y/n)
  --If yes, start date
  --If yes, name and address of employer
  --If yes, job title and one-sentence job description
  --If yes, hourly wage
  --If yes, hours worked per week
  --If yes, receive health benefits? (y/n)

GED Completion
* Completed GED? (y/n/NA)

Certification(s)
* Earned a certification recognized by employers and/or educational institutions? (y/n/NA)
  --If yes, name of certification and issuing agency

Further Education and Training
* Enrolled in further education and training within 12 months of program completion? (y/n)
  --If yes, date enrolled
  --If yes, name of college or school
  --If yes, name of program and one-sentence description
  --If yes, participant's goal for the program
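To show how Table 6's data elements might be operationalized in a tracking system, a program could store each participant's performance record in a structure like the following. The field names are an illustrative mapping of the table's elements, not a schema prescribed by the guide.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class OutcomeRecord:
    """Illustrative record mirroring Table 6's data elements."""
    participant_id: str    # stand-in for SSN; a hashed ID is safer to store
    start_date: str        # date participant started in program
    completed: bool
    reason_for_leaving: Optional[str] = None   # if not completed
    reading_level_at_completion: Optional[float] = None
    math_level_at_completion: Optional[float] = None
    employed_within_12_months: Optional[bool] = None
    hourly_wage: Optional[float] = None
    completed_ged: Optional[bool] = None       # None where not applicable
    certifications: list = field(default_factory=list)
    enrolled_in_further_education: Optional[bool] = None
```

Using `None` for the "NA" and not-yet-known cases keeps the record usable both during the program and at follow-up.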
Publication: Bridges to Careers for Low-Skilled Adults: A Program Development Guide
Date: Jan 1, 2005