
Making the Case: Demonstrating the Impact of Career and Employment Services

Career development practitioners understand that the value of career and employment services extends beyond attachment to the labor market. However, "soft" outcomes such as increased self-efficacy and improved ability to manage transitions are difficult to measure. Convincing funders and policy makers of the significance of such outcomes may be challenging; consequently, important interventions may not be mandated or funded. In this article, the importance of effective program evaluation is highlighted, limitations of current evaluation models are identified, and a draft evaluation framework that facilitates tracking long-term impacts of career and employment services is applied to critique a government-commissioned summative evaluation report.

**********

Within the career and employment services sector, the importance of conducting effective research is widely recognized. The needs of individual career practitioners and the organizations in which they work are changing constantly, affected by regional and global economies. Career development practitioners need to document and communicate successes; to inform program and service design and development, they also need information about what is not working well. In this article, our goal is to present service providers with an alternative framework for conducting more effective research and to alert funders and policy makers to challenges with some of the research that influences their decision making.

THE CONTEXT

In a global economy, people have much to learn from work beyond their borders. Relevant to this review are international symposia on career development and public policy convened between 1999 and 2007. In response to early symposia, the "Career Guidance and Public Policy: Bridging the Gap" report noted that "the longer-term evidence [of career guidance] is quite weak, and obtaining it will require more and better longitudinal research" (Organisation for Economic Co-operation and Development, 2004, p. 8). The 2003 Pan-Canadian Symposium on Career Development, Lifelong Learning and Workforce Development (hereinafter the Pan-Canadian Symposium) confirmed the need for more effective research, acknowledging that policy makers and funders need evidence clearly demonstrating the successes that career services providers claim to achieve. However, at the time, no one could provide a clear system for gathering that evidence.

One outcome of the Pan-Canadian Symposium was the formation of the Canadian Research Working Group on Evidence-Based Practice in Career Development (CRWG; Baudouin & Hiebert, 2007), resulting in the draft of a comprehensive framework for evaluating career development services (Baudouin et al., 2007). Concurrently, research on evidence-based practice has proceeded internationally, within the United States (Wall, 2007), United Kingdom (Hughes, 2004), and many other countries.

In British Columbia, Canada, significant changes to publicly funded career services are anticipated as responsibility for allocating federal labor market development programming funds shifts to the provincial government. To prepare for these changes, an unprecedented collaboration has occurred between diverse professional associations representing the career development community, resulting in formation of the British Columbia Career and Workforce Development Alliance (hereinafter the Alliance). The Alliance is coordinating efforts to ensure that career development sector representatives will play an influential role in consultations expected to shape the future direction of this sector.

In partial fulfillment of this mission, the Alliance has commissioned topical papers to prepare career practitioners and service providers for anticipated regional changes. This article has been condensed from one of those papers. By examining relevant research and recommending specific evaluation strategies, we hope to equip career practitioners and organizations to document, evaluate, and communicate the important work that they do. A secondary purpose is to equip funders and policy makers to critically review relevant research and make informed requests for data that will more effectively portray the impact of their investments.

THE COMPLEXITY OF RESEARCH AND EVALUATION

The importance of research and evaluation is not in question. At the Pan-Canadian Symposium, each of the stakeholder groups (i.e., career practitioners, employers, policy makers, and academics) agreed that documenting the influence of career and employment programs and services is essential. The challenge, however, is in how to accomplish this. Many of the key players have not been trained in how to design research or conduct program evaluations. The following sections are intended as a very brief introduction to both.

What Is Research?

Research has been defined as "a systematic process of collecting, analyzing, and interpreting information (data) in order to increase our understanding of the phenomenon about which we are interested or concerned" (Leedy & Ormrod, 2005, p. 2). Some research is quantitative (i.e., factors can be assigned numerical values); other research is qualitative (i.e., themes and patterns are extracted from the data).

All research, however, is influenced by underlying assumptions about what is worth measuring, how and when to best measure it, who should be included, and why the topic is important. These assumptions influence the questions investigated in a research project and provide a context within which the results are interpreted. Because research is guided by a specific research question or hypothesis, outcomes will be influenced by the purpose of the research, which in turn shapes the research design. This is important to acknowledge--essentially, researchers measure what they value and may observe that which they seek. A research question is not right or wrong; however, the question does shape results by privileging consideration of some variables over others.

For example, a service provider setting out to measure the success of an interview skills workshop likely believes that such interventions are important. The funder may have mandated program evaluation; perhaps a belief also exists that the workshop could be improved or that it is exceptionally successful and worth documenting.

A desire or mandate to conduct research, however, is not enough. Preliminary research by the CRWG indicated the following:
   Evaluation was considered to be difficult because of the complexity
   of determining and measuring outcomes and difficulty following-up
   with clients. Funders require evaluation, but there is a lack of
   training in evaluation, evaluation is not funded, and standardized
   evaluation protocols and definitions do not exist. (Lalande,
   Hiebert, Magnusson, Bezanson, & Borgen, 2006, p. 5)


The Challenge of Proving Causality

Demonstrating causality when researching interventions within the career and employment services sector is particularly challenging (i.e., what is the proof that a specific program or service, and not some other unmeasured variable, actually contributed to participant success?). For example, if 80% of participants in an interview workshop were hired after their first job interview, the workshop seems to be a resounding success. However, if 90% of the members of a comparison group achieved similar success without interview skills training, the workshop may be deemed ineffective; other variables (e.g., a low unemployment rate) may be responsible for the job seekers' success. Mason and Tereraho (2007) highlighted this "attribution problem in evaluation" (p. 4) and emphasized the importance of an experimental or quasi-experimental program design in investigating causality.
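
To make the attribution problem concrete, the following sketch (in Python, using invented counts rather than data from any actual program evaluation) contrasts the naive reading of a workshop's success rate with an estimate that accounts for a comparison group:

    # Hypothetical illustration of the attribution problem in evaluation.
    # All counts are invented for illustration.

    def success_rate(hired: int, participants: int) -> float:
        """Proportion of a group that found work."""
        return hired / participants

    # Workshop group: 80% hired after their first job interview.
    workshop_rate = success_rate(hired=40, participants=50)

    # Comparison group with no interview skills training: 90% hired.
    comparison_rate = success_rate(hired=45, participants=50)

    # Without a comparison group, all success is attributed to the workshop.
    naive_effect = workshop_rate

    # With a comparison group, the estimate subtracts what similar job seekers
    # achieved anyway; a negative value suggests other variables (e.g., a low
    # unemployment rate), not the intervention, are driving the outcome.
    estimated_effect = workshop_rate - comparison_rate

    print(f"Workshop group:   {workshop_rate:.0%}")    # 80%
    print(f"Comparison group: {comparison_rate:.0%}")  # 90%
    print(f"Naive effect: {naive_effect:+.0%}; "
          f"with comparison group: {estimated_effect:+.0%}")  # +80%; -10%

Even this simple contrast shows why an experimental or quasi-experimental design matters: without a credible comparison group, the naive estimate cannot separate the intervention's contribution from background labor market conditions.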

CAREER AND EMPLOYMENT SERVICES RESEARCH AND EVALUATION: STRATEGIES AND FRAMEWORKS

Hughes (2004) cautioned that "assessing the overall impact of career guidance presents real challenges for practitioners, managers, trainers, policy makers, and researchers" (p. 1). The CRWG identified that "there is a great need to improve how and what is being measured" (Lalande & Magnusson, 2007, p. 133), reporting that larger, publicly funded agencies were more likely than smaller ones or educational institutions to systematically report outcomes. Mason and Tereraho (2007) lamented that "many public programs and policies do not provide quantifiable outcomes, and this limits conclusions on value for money" (p. 1). Taken together, these observations suggest that many service providers are not measuring what needs to be measured, which limits policy makers' and funders' access to the information they need to guide decisions.

Some of these assessment challenges reflect the lack of standardization in the profession (Hughes, 2004). Others reflect the complexity of client change (Bright & Pryor, 2005). A useful research and evaluation framework must be dynamic enough to capture complexity, yet standardized enough to be accessible and understood by diverse stakeholders.

Several recent projects have provided a good base for research and evaluation within the career and employment services sector (America's Career Resource Network, n.d.; Hughes, 2004). Conger and Hiebert (2007) presented a thought-provoking approach to assigning financial value to employment equivalence as a viable outcome measure of career and employment programs. Similarly, Mason and Tereraho (2007) suggested strategies to strengthen the oft-requested value-for-money analyses.

On a broader scale, within Canada, the CRWG conducted preliminary research to "determine the state of evidence based practice in Canada" (Lalande & Huston, 2005, p. 1) and proposed a comprehensive evaluation framework (Baudouin et al., 2007; Lalande et al., 2006). Because this draft framework was grounded in research about current sector-specific evaluation practices, it provides a foundation for critiquing published research in the career and employment services sector. In the following sections, we briefly summarize the framework and then use it to examine a government-commissioned evaluation of career and employment services.

CRWG DRAFT EVALUATION FRAMEWORK

The framework proposed by the CRWG comprises three elements: inputs, processes, and outcomes (Baudouin et al., 2007). Each element is defined as follows (an illustrative data sketch follows the list):

* Inputs include all of the resources available to provide programs or services (e.g., staff, funding, facilities, infrastructure, and community resources), are influenced by mandates and guidelines, and may constrain or support services offered.

* Processes include generic and specific interventions comprising the programs and services offered (e.g., working alliance skills, group activities, online and paper-based resources, measures of satisfaction, and other indicators of quality service).

* Outcomes include all indicators of client change (e.g., mastery of learning objectives, changes in attitudes and skills, engagement in training, employment status, and independence from financial supports).
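
To illustrate how these three elements might structure an agency's data collection, the following Python sketch models a single evaluation record. The field names and types are our own hypothetical choices; the CRWG draft framework does not prescribe a data schema:

    # Hypothetical sketch: organizing evaluation data around the CRWG's
    # three elements. Field names are illustrative assumptions only.
    from dataclasses import dataclass, field

    @dataclass
    class Inputs:
        """Resources available to deliver the program or service."""
        staff_hours: float
        funding: float
        facilities: list[str] = field(default_factory=list)

    @dataclass
    class Processes:
        """Generic and specific interventions delivered to clients."""
        interventions: list[str] = field(default_factory=list)
        client_satisfaction: float | None = None  # e.g., mean survey rating

    @dataclass
    class Outcomes:
        """Indicators of client change, beyond employment status alone."""
        employed: bool | None = None
        learning_objectives_met: bool | None = None
        independent_of_income_support: bool | None = None

    @dataclass
    class EvaluationRecord:
        program: str
        inputs: Inputs
        processes: Processes
        outcomes: Outcomes

    record = EvaluationRecord(
        program="Interview Skills Workshop",
        inputs=Inputs(staff_hours=120.0, funding=25_000.0,
                      facilities=["community classroom"]),
        processes=Processes(interventions=["mock interviews", "group practice"],
                            client_satisfaction=4.5),
        outcomes=Outcomes(employed=True, learning_objectives_met=True),
    )

Capturing all three elements in one record makes it possible to report not only whether clients found work but also which resources and interventions accompanied that change.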

Programs and services are typically designed to achieve specific outcomes. Taking a narrow approach to defining desirable outcomes will, by definition, limit how the success of a program can and will be measured. For example, the governments of Canada and British Columbia currently have a comanagement agreement outlining how labor market development programs, called Employment Benefits and Support Measures (EBSMs), are delivered in British Columbia. The goal of EBSMs is to provide services to individuals, to assist them in obtaining sustainable employment, and to reduce payments from the Employment Insurance (EI) program (Human Resources and Social Development Canada [HRSDC], 2004). As such, program success is calculated by measuring the savings to an active EI claim (i.e., if an individual returns to work before his or her EI claim expires, unpaid benefits are considered net savings).
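
As a minimal illustration of this success metric, the sketch below computes the savings attributed to a hypothetical claim; the benefit rate and entitlement period are invented, and actual EI rules are more complex:

    # Hypothetical sketch of "savings to an active EI claim": when a claimant
    # returns to work before the claim expires, unpaid benefit weeks are
    # counted as net savings. All figures are invented for illustration.

    def ei_net_savings(weekly_benefit: float,
                       entitled_weeks: int,
                       weeks_paid_before_return: int) -> float:
        """Value of the benefit weeks left unpaid when the claimant returns to work."""
        unpaid_weeks = max(entitled_weeks - weeks_paid_before_return, 0)
        return unpaid_weeks * weekly_benefit

    # A claimant entitled to 40 weeks at $400/week returns to work after
    # 25 weeks; the remaining 15 unpaid weeks count as $6,000 in savings.
    print(ei_net_savings(weekly_benefit=400, entitled_weeks=40,
                         weeks_paid_before_return=25))  # 6000.0

Note how this metric, by construction, credits nothing for outcomes such as improved self-efficacy or long-term attachment to the labor market.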

Taking a broader approach to defining outcomes, such as proposed by the CRWG's draft evaluation framework, would require a more sophisticated program evaluation design and comprehensive measurements. However, the result could be a more relevant and long-term assessment of client change.

ANALYSIS OF SUMMATIVE EVALUATION REPORT AND RESPONSE

To illustrate limitations of current career program and service evaluations and to encourage consideration of the comprehensive CRWG draft evaluation framework, we reviewed a publicly available report. The HRSDC-commissioned "Summative Evaluation of Employment Benefits and Support Measures under the Terms of the Canada/British Columbia Labour Market Development Agreement" (hereinafter the Summative Evaluation) examined the Skills Development Employment Benefit, Targeted Wage Subsidies, Self-Employment, Job Creation Partnerships, Employment Assistance Services, and Labour Market Partnerships (HRSDC, 2004). We used the CRWG's framework to critically examine this evaluation report of EBSM programs within British Columbia. The Summative Evaluation is currently posted on the HRSDC Web site, with no indication of a more recent evaluation report to replace it. It is likely--and unfortunate--that data collected 7 years ago are being used to inform current funding decisions. Plans were announced in February 2008 for a full devolution of the Labour Market Development Agreement (LMDA), entailing a significant transfer of responsibility and funds within British Columbia; thus, significant decisions may have been based on outdated information.

To organize the critique of the Summative Evaluation (HRSDC, 2004), we have conducted a Strengths, Weaknesses, Opportunities, and Threats (SWOT) analysis. The SWOT analysis is intended to serve multiple purposes. First, for service providers, this analysis may highlight the importance of tracking the rich data necessary for a comprehensive program and service evaluation. Second, for policy makers and funders, this analysis may reveal limitations of previous research and increase awareness of the range of benefits of career and employment programs and services. Third, this analysis may inform a redesign of the current program delivery model. Finally, by comparing the Summative Evaluation with the CRWG's draft evaluation framework, we hope to inspire stakeholders to more actively contribute to comprehensive research and evaluation projects.

Strengths

The Summative Evaluation (HRSDC, 2004) measured what it set out to measure. EBSMs were designed to save EI money; accordingly, the Summative Evaluation provided a cost--benefit analysis (CBA) of each of the six EBSMs. To calculate cost--benefit ratios in a CBA, monetary values are assigned to various tangible and intangible factors (Mason & Tereraho, 2007).

The Summative Evaluation (HRSDC, 2004) also provided an extensive evaluation of the EBSMs, albeit on limited outcome measures, and recommended further research. For example, the typical outcome measure of "return to work" was identified as neither the only nor always the most appropriate outcome to consider (HRSDC, 2004). The results of this comprehensive evaluation are publicly available, providing all career development sector stakeholders with a valuable starting point for further research and evaluation.

Weaknesses

Though noted as a strength, the CBA approach to program evaluation can also be considered a weakness. Mason and Tereraho (2007) found that many studies that purported to take a CBA approach were, in fact, performing a cost-effectiveness analysis (CEA). Simply put, a CEA approach measures the desired outcome (e.g., return to work) against the cost of the intervention (e.g., interview skills workshop). The Summative Evaluation (HRSDC, 2004) addressed the differences between CBA and CEA approaches, defending the decision to proceed with the former. However, whether the measures were sufficient to paint a complete and accurate picture of the impact of EBSMs is questionable. Herr (2003) noted the following:
   Concentrating on the development of cost--benefit analyses can be
   considered a strategic opportunity in an era of growing
   expectations for public accountability. To do so, however, will at
   a minimum require clarity about the level and kind of evidence that
   policy makers and administrators want about the benefits of career
   counseling and related interventions and how they view their return
   on investment for supporting such processes. (pp. 16-17)
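
To make the CBA/CEA distinction concrete, the following sketch evaluates the same hypothetical program both ways; every dollar figure is invented for illustration:

    # CEA relates a single desired outcome to intervention cost; CBA assigns
    # monetary values to a wider set of tangible and intangible benefits and
    # compares their total with costs. All figures are hypothetical.

    program_cost = 100_000.0       # cost of delivering the intervention
    clients_returned_to_work = 80  # the single outcome a CEA tracks

    # CEA: cost per unit of outcome (lower is better).
    cost_per_placement = program_cost / clients_returned_to_work
    print(f"CEA: ${cost_per_placement:,.0f} per return to work")  # $1,250

    # CBA: monetize benefits, then form a benefit-cost ratio
    # (a ratio above 1.0 means monetized benefits exceed costs).
    monetized_benefits = {
        "EI benefits left unpaid (early return to work)": 90_000.0,
        "increased participant earnings": 40_000.0,
        "estimated value of improved self-efficacy": 15_000.0,  # intangible
    }
    benefit_cost_ratio = sum(monetized_benefits.values()) / program_cost
    print(f"CBA: benefit-cost ratio = {benefit_cost_ratio:.2f}")  # 1.45

The contested step, as Herr's (2003) caution implies, is assigning defensible dollar values to intangible benefits; the arithmetic itself is trivial.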


When the Summative Evaluation (HRSDC, 2004) is compared with the CRWG framework, several pieces are missing in each of the three elements (i.e., inputs, processes, and outcomes). The Summative Evaluation considered only the initial investments by government (i.e., funds) and clients (i.e., time). Longitudinal data (e.g., the influences of interventions on long-term attachment to the labor market and increases in employment earnings) were not included in the CBA. To be fair, this exclusion may be less a weakness of the Summative Evaluation than of the overall LMDA Accountability Framework, which solely attends to short-term results. However, in comparison with the rich data that could be used to measure the success of career and employment programs and services, the short-term focus and very limited input and outcome measures may be viewed as weaknesses.

Opportunities

Ongoing research was acknowledged as important in the Management Response to the Summative Evaluation (HRSDC, 2004, p. xviii); undertaking such research would present many opportunities for service providers to collect evidence of the success of existing programs and services as well as to examine new ways to support workforce development. The Summative Evaluation identified several target groups that were not well represented in the survey data (e.g., Aboriginals, immigrants), clearly highlighting opportunities for additional research. Also noted was a discrepancy between the success rates of EBSMs for active versus former claimants. Because various ways exist to interpret this finding, further research is necessary. That individuals with more tenuous attachment to the workforce, or extended periods of unemployment, require more comprehensive services to facilitate their career success is widely recognized (Conger & Hiebert, 2007; Herr, 2003; Mason & Tereraho, 2007).

EBSM results also seemed to be affected by geographic location and local labor markets. Community-based organizations throughout the province are, therefore, well positioned to facilitate research across diverse geographic regions.

The Summative Evaluation (HRSDC, 2004) also noted the lack of strong connections to employers and of recognition by employers that EBSMs were of benefit. This finding is not surprising, given that Service Canada's funding of EBSMs is legislated through the EI Act and is regulated to ensure client-centered investments (i.e., benefits were directed toward individuals to facilitate rapid reattachment to a labor market that, at the time, was debilitated by high levels of unemployment). However, a strength of the federal procurement practices for awarding EBSM contracts has been their responsiveness to changing labor markets; in today's economy, defined by a shortage of skilled workers rather than high unemployment, the importance of employers fully understanding how career and employment services can help them meet their workforce requirements must not be underestimated.

This finding suggests an opportunity for service providers to make better use of their existing connections with employers; for providers who have not yet developed employer partnerships, it serves as an important wake-up call to the emerging importance of such connections. In the current economy, which is more "demand-driven" (i.e., focused on the needs of employers) than "supply-driven" (i.e., focused on the unemployed), close links to the employer community are increasingly important. Tracking and articulating the benefits and cost effectiveness of the career and employment services sector's contributions to workforce development (e.g., the effect of employment and career services on employee attraction, engagement, skill development, and retention) has the potential to enhance links to employers, especially in labor markets characterized by labor and skill shortages.

Threats

Related to the opportunity to forge stronger links to employers, the Summative Evaluation (HRSDC, 2004) finding that many employers were unaware of career and employment services and, when surveyed, did not view them as relevant to their needs may be perceived as a threat. At the time of the Summative Evaluation research, Employment Assistance Services accounted for 75% of all British Columbian LMDA interventions and were consuming an increasing share of the budget (i.e., up from 27% in 1997-1998 to 42% in 2001-2002; HRSDC, 2004). Unfortunately, the Summative Evaluation noted that program participation was not resulting in significant employment status improvements for either active or former EI claimants; clearly, such results represent a threat to continued funding. The Summative Evaluation also reported that, despite the introduction of Contact IV (a standardized data management system), collecting participant data remained an issue. If career service providers were to conduct comprehensive research using frameworks such as the draft provided by the CRWG (Baudouin et al., 2007), data collection would be essential.

Another threat is more subtle. As reported in the Summative Evaluation (HRSDC, 2004), participants in EBSMs had higher levels of education than did the general unemployed population. However, during the period covered by the Summative Evaluation, service providers had been mandated to focus on the most employable clients to generate immediate savings to the EI fund. It is therefore somewhat surprising that the Summative Evaluation implied "skimming" on the part of service providers (i.e., selecting only the participants with the highest potential for success).

Further research could paint a different picture; it would be interesting to document the extent to which equity group members and those with significant barriers to employment are being served through existing LMDA Support Measures. Such support for uninsured, vulnerable populations is an important yet undermeasured benefit of EBSMs. In the current economy, the labor market is defined by low unemployment and severe shortages of skilled workers; participants in career programs and services are, in many cases, much needier than before. Employers are reaching out to underrepresented populations, many of whom have multiple barriers to employment. Although it is widely documented that such individuals will require more employment supports and will likely take longer to fully attach to the workforce (Conger & Hiebert, 2007; Herr, 2003; Mason & Tereraho, 2007), if funding and programming decisions are based on the research presented in the Summative Evaluation, unrealistic outcome targets may be set, and programs and services may be ill equipped to provide the necessary levels of support to individuals currently accessing publicly funded services.

SUMMARY, LIMITATIONS, AND A CALL TO ACTION

In summary, unless career service providers and career practitioners can demonstrate both the efficacy and the effectiveness of their work, in language understood and accepted by funders, policy makers, and employers, they face a significant risk of losing funding for services widely believed to be important. The CRWG began its research in response to a call at the Pan-Canadian Symposium to "show me the evidence" (Baudouin & Hiebert, 2007, p. 128). The CRWG has since drafted a comprehensive framework to guide data collection and program evaluation (Baudouin et al., 2007), based on surveys, focus groups, and telephone interviews with diverse stakeholders in the career and employment services sector (Lalande & Magnusson, 2007). It is hoped that this framework can help move the field beyond the debate in the evaluation literature about the pros and cons of CBA in comparison with CEA--many creative ways exist to measure the effects of career programs and services (Conger & Hiebert, 2007; Jarvis, 2003).

As with any brief paper, the scope of this article has limitations. Our goal was to stimulate interest in conducting timely and relevant research within the career services sector and to introduce service providers and policy makers to relevant tools and resources to assist with the task. We acknowledge the regional emphasis, with references to programs and funding changes within our local province. We also recognize that many appropriate ways exist to structure research and evaluations of career programs and services and that the model we are recommending, developed by the CRWG, is still in its draft version. However, we are convinced that timely action is imperative. Rich data can stimulate thoughtful debate and discussions; those discussions can, in turn, equip career practitioners, service providers, policy makers, and funders to make sound, informed decisions.

The evaluation framework proposed by the CRWG provides a starting place for the systematic and intentional collection of that data. All stakeholders in the career and employment services sector share an interest in ensuring that clients receive the services they need to become fully engaged in the workforce and that employers have access to the best workers available. Funding is limited and will be directed toward programs and services that can clearly document successes. Research and evaluation practices influence every facet of this sector--it is time to work together to ensure that evidence is systematically collected and effectively analyzed to inform decisions about programs, services, and funding.

The CRWG draft framework highlights the importance of beginning with the end in mind. Research and evaluation must meet the needs of the end users. Funders mandate data collection to measure variables of interest to them, ensuring accountability for money invested by the government. However, service providers can also collect data that will demonstrate the "soft" outcomes and positive long-term influences of their services to clients. The CRWG framework is flexible enough to accommodate research that has the potential to satisfy and inform the diverse stakeholders in the career and employment services sector. Our hope is that this article inspires more research in the profession.

REFERENCES

America's Career Resource Network. (n.d.). Evaluation. Retrieved October 27, 2007, from http://www.acrnetwork.org/evaluation.htm

Baudouin, R., Bezanson, L., Borgen, B., Goyer, L., Hiebert, B., Lalande, V., et al. (2007). Demonstrating value: A draft framework for evaluating the effectiveness of career development interventions. Canadian Journal of Counselling, 41, 146-157.

Baudouin, R., & Hiebert, B. (2007). Introduction to special issue on evidence-based practice in career development. Canadian Journal of Counselling, 41, 127-129.

Bright, J. E. H., & Pryor, R. G. L. (2005). The chaos theory of careers: A user's guide. The Career Development Quarterly, 53, 291-305.

Conger, S., & Hiebert, B. (2007). Employment and educational equivalence outcomes as measures of employment and career counselling. Canadian Journal of Counselling, 41, 186-193.

Herr, E. L. (2003). The future of career counseling as an instrument of public policy. The Career Development Quarterly, 52, 8-17.

Hughes, D. (2004). Creating evidence: Building the case for career development (Food for Thought Document No. 15). Retrieved October 27, 2007, from the Counsellor Resource Centre Web site: http://www.crccanada.org/crc/files/Communication_Strategy_No.15_Hughes818_2.pdf

Human Resources and Social Development Canada. (2004). Summative evaluation of employment benefits and support measures under the terms of the Canada/British Columbia Labour Market Development Agreement. Retrieved October 10, 2007, from http://www.hrsdc.gc.ca/en/cs/sp/hrsd/evaluation/reports/sp-ah-666-04-04/page00.shtml

Jarvis, P. S. (2003). Career management skills: Keys to a great career and a great life (Food for Thought Document No. 8). Retrieved October 27, 2007, from the Counsellor Resource Centre Web site: http://www.crccanada.org/crc/files/Communication_Strategy_No.8_Jarvis716_2.pdf

Lalande, V., Hiebert, B., Magnusson, K., Bezanson, L., & Borgen, B. (2006). Measuring the impact of career services: Current and desired practices. Retrieved October 27, 2007, from the National Consultation on Career Development Web site: http://www.natcon.org/natcordpapers/natcon_papers_2006_e5.pdf

Lalande, V., & Huston, M. (2005). Developing an accountability framework for career development practices (Food for Thought Document No. 16). Retrieved October 27, 2007, from the Counsellor Resource Centre Web site: http://www.crccanada.org/crc/files/Communication_Strategy_No.16_Lalande&Huston827_2.pdf

Lalande, V., & Magnusson, K. (2007). Measuring the impact of career development services in Canada: Current and preferred practices. Canadian Journal of Counselling, 41, 133-145.

Leedy, P. D., & Ormrod, J. E. (2005). Practical research: Planning and design (8th ed.). Upper Saddle River, NJ: Pearson Education.

Mason, G., & Tereraho, M. (2007). Value-for-money analysis of active labour market programs. Canadian Journal of Program Evaluation, 22, 1-29.

Organisation for Economic Co-operation and Development. (2004). Career guidance and public policy: Bridging the gap. Retrieved October 27, 2007, from http://www.oecd.org/dataoecd/33/45/34050171.pdf

Wall, J. E. (2007). Is your program working? Resources for evaluating your career program. Retrieved June 18, 2008, from the Sage Solutions Web site: http://home.earthlink.net/~sagesolutions/Evaluation%20Resources.pdf

Roberta A. Neault and Deirdre A. Pickerell, Life Strategies Ltd., Coquitlam, British Columbia, Canada. This article is based on a presentation by both authors at the November 2007 Association of Service Providers for Employability and Career Training (ASPECT) conference in Vancouver, British Columbia, Canada. The authors thank ASPECT for partial funding of the project and several reviewers, especially Norma Strachan and Doug Preston, for feedback on earlier drafts. Correspondence concerning this article should be addressed to Roberta A. Neault, Life Strategies Ltd., 2956 Fleet Street, Coquitlam, British Columbia, Canada V3C 3R8 (e-mail: Roberta@lifestrategies.ca).
