The role and function of certified vocational evaluation specialists.

During the second half of the 20th century, the practice of Vocational Evaluation in North America evolved into a distinct professional specialty (Shumate, Hamilton, & Fried, 2004). The profession emerged in response to a demand within the field of vocational rehabilitation for improved vocational assessment measures that did not discriminate against individuals with disabilities. New techniques and instruments were devised that included realistic and in-depth assessment methodology to facilitate the exploration of clients' vocational potential. The use of real or simulated work as a primary assessment tool for determining clients' potential for successful performance and adjustment in a specific occupation remains an essential and distinct characteristic of vocational evaluation today (Pruitt, 1986; Thomas, 1999).

Soon after vocational evaluation began to emerge as a separate discipline, a number of research efforts were initiated to identify the specific skills, knowledge and assessment techniques used by vocational evaluators in practice (Egerman & Gilbert, 1969; Nadolsky, 1971; Sankovsky, 1969; Spieser, 1967). However, most of the research describing vocational evaluation, with specific domains of knowledge, role, and function, has occurred since the 1970s (Taylor, Bordieri, Crimando, & Janikowski, 1993).

In one of the initial efforts to define vocational evaluation, Pruitt (1972) utilized a task analysis approach to examine the role and function of 45 vocational evaluators. The results identified 67 job tasks grouped into 7 major job function domains. The seven domains included: (a) evaluation, (b) counseling and interviewing, (c) training, (d) administration, (e) occupational analysis, (f) communication and relating, and (g) research and development. Shortly afterward, the Rehabilitation Services Administration (RSA), the National Rehabilitation Association (NRA), and the Vocational Evaluation and Work Adjustment Association (VEWAA) cosponsored a monumental three-year study entitled the Vocational Evaluation Project (1973-1975). Seven task forces were formed to explore and develop standards for work evaluation services. Confusion ensued regarding the delineation of specific roles and functions of vocational evaluators, primarily due to the diversity of work settings and targeted clientele. The study concluded that there was "... lack of a clear conceptual ideological statement relative to the role and function of the evaluator" (VEWAA, 1975, p. 123).

Coffey's (1978) doctoral dissertation may be the earliest comprehensive survey of the essential competencies of vocational evaluators. From an original list of 2,500 vocational evaluator skill, knowledge, and ability statements, Coffey's resulting questionnaire retained 175 items grouped into 9 categories. When Coffey's sample of 116 vocational evaluators, educators, and graduate students rated the 175 statements on a 5-point scale ranging from essential to most essential, 95 competency statements received a mean rating of 4 or better. In a subsequent analysis of the original data, Coffey, Hansen, Menz, and Coker (1978) identified the top 20 competency statements that received high priority ratings from 92% of the respondents. The highest ratings were in the areas of Working Relations (interagency relationships), Analysis and Synthesis of Evaluation Data, and Communication.

Despite early research efforts delineating specific roles and functions of vocational evaluators, there remained significant divisions between rehabilitation counselors and vocational evaluators regarding their specific scope of practice (Emener, McFarlane, & Parker, 1978). While most authorities on the subject agree that considerable overlap exists between the two professions, vocational evaluation retains distinct core functions and competencies separate from rehabilitation counselors (Sink & Porter, 1978; Sink, Porter, Rubin, & Painter, 1979; Gannaway & Sink, 1979; Sigmon, Couch, & Halpin, 1987).

A complicating factor in distinguishing vocational evaluation from its sibling rehabilitation professions has been role variability. Differences between employment settings (i.e., public, private non-profit, private for-profit, schools) account for wide disparities in the job functions of rehabilitation counselors and vocational evaluators. Leahy, Shapson and Wright (1987), in a nationwide investigation of 1,149 rehabilitation practitioners' roles by specialization and employment setting, determined that "evaluators as a group seemed quite different in their perceptions of competency importance in a number of areas" (p. 125). Notable among these were Assessment Planning and Assessment Administration. Therefore, despite wide variance in the job functions that vocational evaluators and rehabilitation counselors perform by employment setting, vocational evaluation requires specific and distinct skills and tasks, a more "restricted range of competence" than either job placement specialists or rehabilitation counselors (p. 128). A short time later, Leahy and Wright (1988) found that vocational evaluators were able to rank six distinct areas of competence with unique applications to their profession. These included: (a) Assessment Planning and Interpretation, (b) Vocational Counseling, (c) Assessment Administration, (d) Job Analysis, (e) Case Management, and (f) Personal Adjustment Counseling.

In a study exclusive to vocational evaluators in the State of Florida, Taylor, Bordieri, Crimando, et al., (1993) examined the job tasks and functions in the public, private non-profit and private for-profit employment settings. By way of principal component analysis, six job function factors were derived and included: (a) Vocational Assessment, (b) Matching Individuals to Jobs, (c) Vocational Counseling and Report Development, (d) Situational Assessment, (e) Job Readiness Appraisal, and (f) Computer Application and Administration. Despite differences in terminology, the researchers concluded that their results were consistent with findings from the previous studies by Coffey (1978) and Leahy and Wright (1988).

Later, Taylor and Bordieri (1993) extended their research in a nationwide survey that incorporated a revised version of the instrument used in the Florida study. Although the research design utilized the largest sample of vocational evaluators to date, the effective response rate was low, with 526 (18.8%) usable surveys returned. Again, a principal component analysis revealed six general job function domains: (a) Vocational Counseling, (b) Behavioral Observation, (c) Occupational Development, (d) Standardized Assessment, (e) Professionalism, and (f) Case Management.

Since the last role and function study was conducted (Taylor & Bordieri, 1993), the practice of vocational evaluation has continued to evolve and diversify. Technological advancements and a global labor market have shifted the structure and methods of participation in the world of work. Significant changes in practice settings, service delivery systems, client characteristics, and evolving federal legislative mandates (e.g., Rehabilitation Act Amendments of 1998; Ticket to Work and Work Incentives Improvement Act of 1999) have affected the daily practice of vocational evaluation and many other rehabilitation professions (Leahy, Chan & Saunders, in press). Therefore, periodic, systematic, and empirical examination of the competencies of a profession provides the foundation for professional standards and for certification that assures a degree (e.g., minimum, advanced, master) of competency in a changing professional environment (Henderson, 1996).

In 2002, the Commission on Certification of Work Adjustment and Vocational Evaluation Specialists (CCWAVES) commissioned the present study to support the certification process through an investigation of the current role and function of Certified Vocational Evaluation Specialists (CVE) in North America. The research design used job analysis methodology as described by Henderson (1996) to identify and validate the important job functions and knowledge domains relevant to vocational evaluation, and to determine whether the relative importance of the job functions and knowledge domains differed significantly across employment settings.

Method

Participants

Participants in this study consisted of a random sample of 800 CVEs from Canada and the USA selected from the larger CCWAVES database of 1,215 CVEs. Of the 800 surveys mailed, 586 (73%) were returned. Of the 586 returned surveys, 558 yielded usable data, for an effective response rate of 70%.

Demographically, the 558 survey respondents included 345 (61.8%) females and 213 (38.2%) males ranging in age from 24 to 74 years (M = 47 years). Participants' mean work experience in vocational evaluation was 16 years. Ethnically, the vast majority of participants were Caucasian (91%), followed by African Americans (3.4%), Hispanics (2.7%), Native Americans (0.7%), Asian Americans (0.5%), and other (0.4%).

Two hundred forty-two (43%) of the CVE respondents were self-employed (22%) or employed in private for-profit venues (21%). Eighteen percent (18%) of respondents were employed in state/federal/public agencies, 20% in private not-for-profit agencies, 7% in school-based settings, and 7% in a university or college venue; the remaining 2.7% reported employment in the other category. Occupational titles of participants included vocational evaluator (30%), administrator/manager (18%), vocational consultant (15%), rehabilitation counselor (14%), case manager (8.5%), university educator (3.8%), other (3.4%), school transition specialist (1.8%), occupational therapist (1.3%), rehabilitation psychologist (1.3%), career assessment specialist (0.9%), and work adjustment specialist (0.4%).

From an educational perspective, nearly three-quarters (73%) of CVE respondents possessed a Master's degree. Other participants held a Bachelor's degree (17.6%), a doctoral degree (7.3%), an Ed.S. degree (2%), or an Associate degree (0.5%). Asked to identify any/all major areas of study within their degree(s), only 17% of participants reported vocational evaluation as a major area/emphasis/concentration in their degree.

Instrumentation

The Vocational Evaluation Job Task and Knowledge Inventory (VE-JTAKI) was developed specifically for this study. Three sources were used in the construction and validation of the instrument: (a) a review of the literature in rehabilitation and vocational evaluation, (b) adoption of items from two previous instruments (Newman, Waechter, Nolte, & Boyer-Stephens, 1998; Taylor & Bordieri, 1993), and (c) the results of a Delphi procedure.

Initially, to identify important job tasks and knowledge areas in contemporary vocational evaluation practice, a role delineation study queried 16 content experts using a Delphi technique. Designed as a consensus-building approach using a series of iterations, the Delphi technique seeks to obtain agreement among the experts (Linstone & Turoff, 1975) on the important skills, abilities, and knowledge areas required for competent vocational evaluation practice. Based on the qualitative and quantitative results of the Delphi procedure, the final VE-JTAKI questionnaire contained 85 job task items rated on two separate, but parallel, 5-point Likert-type rating scales: importance and frequency. Responses on the scales ranged from 1, indicating Not Important or Not at All, to 5, indicating a rating of Extremely Important or Most of the Time. Similarly, the VE-JTAKI contained 55 knowledge area items, also rated on two separate, but parallel, 5-point Likert-type rating scales: importance and preparedness. Responses on these scales ranged from 1, indicating Not Important or No Preparation, to 5, indicating Extremely Important or Very High Degree of Preparation.

Results

Data Analysis

Data from the VE-JTAKI were analyzed using a type of factor analysis termed Principal Component Analysis (PCA). PCA identifies underlying patterns that allow data to be condensed or summarized into a smaller set of factors (components), and its results explain the variability and correlation structure of the items. Two separate analyses following identical procedures were conducted. The first PCA used importance ratings of the 85 job tasks to derive the major job functions, and the second PCA used the importance ratings of the 55 knowledge items to derive the major knowledge domains.

General procedures employed in both PCAs included the Kaiser-Guttman criterion (i.e., eigenvalues greater than 1) to generate initial factor solutions, followed by a review of Cattell's scree plot of eigenvalues as an alternate method to determine the number of interpretable factors to be retained (Leahy et al., in press). Retained factors were rotated by both VARIMAX orthogonal and OBLIMIN oblique methods to identify the most parsimonious yet meaningful solution. Factor loadings equal to or greater than .40 in the rotated structure were retained and included in the interpretation of the factor solutions. This loading criterion was selected based on Tabachnick and Fidell's (2001) rule of thumb that "... only variables with loadings of .32 and above are interpreted" (p. 625).
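
To make the retention and rotation steps concrete, the following minimal Python sketch illustrates the general approach described above: eigenvalues of the item correlation matrix, the Kaiser-Guttman criterion, a varimax rotation, and the .40 loading cutoff. It is not the authors' code; the randomly generated ratings are placeholders, and the real analysis was run on the survey data.

```python
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Orthogonal varimax rotation of a factor loading matrix."""
    p, k = loadings.shape
    rotation = np.eye(k)
    criterion = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        # SVD of the gradient of the varimax criterion
        b = loadings.T @ (rotated ** 3 -
                          (gamma / p) * rotated @ np.diag((rotated ** 2).sum(axis=0)))
        u, s, vt = np.linalg.svd(b)
        rotation = u @ vt
        if s.sum() < criterion * (1 + tol):
            break
        criterion = s.sum()
    return loadings @ rotation

# Placeholder data: 549 respondents x 85 importance ratings on a 1-5 scale
rng = np.random.default_rng(0)
ratings = rng.integers(1, 6, size=(549, 85)).astype(float)

corr = np.corrcoef(ratings, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]                    # largest eigenvalues first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

n_kaiser = int((eigvals > 1.0).sum())                # Kaiser-Guttman: eigenvalues > 1
n_factors = 6                                        # final choice, guided by the scree plot
loadings = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])  # unrotated PCA loadings
rotated = varimax(loadings)
salient = np.abs(rotated) >= 0.40                    # loadings retained for interpretation
```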

Major Job Functions

To identify the major job functions of vocational evaluation, the 85 job task items of the VE-JTAKI were submitted to a Principal Component Analysis (PCA). Based on principles of parsimony, ease of labeling, and meaningful interpretation, the superior solution was a VARIMAX 6-factor solution, which accounted for 49.05% of the total variance.

All 85 job task items loaded on at least one of the six factors and some items loaded on more than one factor; a total of 76 job task items were retained for use in the final interpretation of the 6 job functions. Reliability coefficients (Cronbach's alpha) were calculated on the retained items and ranged from moderately high (.68) to very high (.94) internal consistency for the 6 factors (see Table 1). Examination of the individual factor structure of the job functions resulted in the following factor labels and subsequent interpretation of the factors.
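
As a companion illustration, Cronbach's alpha for the items loading on a given factor can be computed directly from the item variances and the variance of the summed scale. This is a generic sketch with placeholder data (uncorrelated random ratings will not reproduce the reported coefficients); the item set is hypothetical.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) matrix of ratings."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Placeholder: ratings for the 16 items loading on one retained factor
rng = np.random.default_rng(1)
factor_items = rng.integers(1, 6, size=(549, 16)).astype(float)
print(round(cronbach_alpha(factor_items), 2))
```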

Factor 1--Clinical Skills to Analyze and Synthesize Assessment Data

This factor includes 16 job task items that address the clinical skills necessary to integrate and interpret assessment and/or evaluation data. A primary emphasis of this factor is the process of individualized assessment with a focus on client involvement, identification of client strengths and needs, and integration of data to yield prioritized recommendations directly related to the world of work. Respondents' mean importance ratings of the items ranged from 4.23 to 4.66.

Factor II--Behavioral Observation and Evaluation Techniques

This factor includes 16 job task items that focus specifically on applied assessment and evaluation methods such as systematic observation and recording of behavior, adaptation/modification of evaluation techniques, and integration of assistive technology within the vocational evaluation process. Primary evaluation methods/techniques in this job function use work as the focal point of assessment and include work samples, standardized instruments, and situational or community-based assessments. Respondents' mean importance ratings of the items ranged from 2.95 to 4.15.

Factor III--Case Management

This factor comprises 16 job task items related to the provision of services such as coordination of assessment venues (e.g., situational or community-based sites), worksite assessment, client education, individualized evaluation planning, assessment of community living skills and job seeking/keeping skills, collaboration with referral sources, employers, and other professionals (e.g., to clarify referral questions, obtain additional information or feedback, conduct staffings, consult, and promote or educate others about assessment services), and appropriate follow-up services. Respondents' mean importance ratings of the items ranged from 3.13 to 4.18.

Factor IV--Occupational Analysis

This factor includes 11 job task items related to the research and analysis of occupational requirements, work environments, and/or education/training programs. Analysis may be conducted using published occupational information to identify worker traits, transferable skills, wage/salary information, job/task analysis, and/or computerized job-matching systems. Research functions include reviewing professional literature (e.g., business, education, medicine, vocational rehabilitation) to identify emerging business trends and skill sets, individualized labor market research, expert testimony/opinion regarding employability, and documentation of findings suitable for legal records. Respondents' mean importance ratings of the items ranged from 3.18 to 4.39.

Factor V--Vocational Counseling

This factor comprises 11 job task items related to the use of counseling skills necessary to develop a collaborative working relationship between the client and evaluator and to facilitate discussion of psychosocial, cultural, disability, adjustment, or other issues that may contribute to functional limitations and impede vocational success. Career counseling, including vocational exploration, planning, and identification of interventions, community resources, and supports to enhance client employability and/or overall quality of life, was also included in this factor. Respondents' mean importance ratings of the items ranged from 3.66 to 4.39.

Factor VI--Professionalism

This factor included six job task items related to professional identity and practice. Professional practice applications included adherence to ethical and legal principles of the profession (code of ethics), following guidelines of certification and/or accreditation, and pursuit of continuing professional development. Items related to professional identity included the use of professional terminology appropriate to the intended audience, maintaining a professional demeanor with clients and colleagues, and awareness and advocacy regarding the social and environmental barriers that impact persons with disabilities. Respondents' mean importance ratings of the items ranged from 4.40 to 4.89.

CVEs also rated how frequently they performed each job task. All job functions were rated at or above 3.00, defined as somewhat frequently. Professionalism (M = 4.45) was the most frequently performed job function, followed by Clinical Skills to Analyze and Synthesize Assessment Data (M = 4.18), Vocational Counseling (M = 3.79), Occupational Analysis (M = 3.49), Case Management (M = 3.21), and Behavioral Observation and Evaluation Techniques (M = 3.01). The identical rank ordering of job functions on the importance and frequency scales suggests that the importance of job tasks is strongly related to their frequency of use.
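
The correspondence between the importance and frequency orderings noted here can be quantified, for example, with a Spearman rank correlation over the six function-level means (the frequency means above and the importance means reported in the summary later in the Results). This is a supplementary illustration, not a statistic reported by the authors.

```python
from scipy.stats import spearmanr

# Mean importance and frequency ratings of the six job functions, ordered:
# Professionalism, Clinical Skills, Vocational Counseling, Occupational Analysis,
# Case Management, Behavioral Observation (values as reported in the article)
importance = [4.57, 4.45, 4.14, 3.90, 3.77, 3.57]
frequency = [4.45, 4.18, 3.79, 3.49, 3.21, 3.01]

rho, p_value = spearmanr(importance, frequency)
print(rho)  # 1.0: the six functions have identical rank ordering on both scales
```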

Major Knowledge Domains

To identify the major knowledge domains of vocational evaluation the 55 knowledge items of the VE-JTAKI were submitted to a Principal Component Analysis (PCA). Based on principles of parsimony, ease of labeling, and meaningful interpretation, the superior solution was a VARIMAX 6-factor solution accounting for 58.2% of the total variance.

All 55 knowledge area items loaded on at least one of the six factors. Two items with loadings below the predetermined criterion of .40 were excluded, and 53 items were retained for the final interpretation of the six knowledge domain factors. Reliability coefficients (Cronbach's alpha) calculated on the retained items ranged from moderately high (.75) to very high (.94) internal consistency for the six factors (see Table 2). Examination of the individual factor structure of the knowledge domains resulted in the following factor labels and subsequent interpretation of the factors.

Factor I--Foundations of Vocational Evaluation

This factor included 20 knowledge items related to the theory and methods common or unique to the vocational evaluation process. Methods included work samples, standardized instruments, and situational and community based assessments and the use of systematic behavioral observation and recording. Theory-related items included principles of learning, concepts of work performance and behavior, theory of work samples and systems, behavioral modification and work adjustment theory, principles of teaching/training, adaptation and/or modification of evaluation methods, and general career development theories. Process related items included individualized evaluation planning, knowledge of service delivery systems and community resources, identification of employer or workplace needs, impact of cultural, disability and ecological variables on employability, and program evaluation and research. Respondents' mean item importance ratings ranged from 3.59 to 4.33.

Factor II--Standardized Assessment

This factor included 10 knowledge items related to the selection, administration, analysis, and interpretation of standardized assessment instruments, as well as their legal and ethical use. Knowledge of the principles and limitations of psychological measurement, the statistical concepts of validity and reliability, vocational evaluation philosophy and process, multiple and emotional intelligences, and triangulation of assessment techniques was also emphasized in this factor. Mean importance ratings of the items ranged from 3.63 to 4.45 for all respondents.

Factor III--Occupational Information

This factor included nine knowledge items related to various sources of occupational information used to classify, analyze, or research occupations and/or education and training program requirements. Knowledge of standardized occupational resources (e.g., DOT, O*NET, GOE, NOC), transferable skills analysis, job/task analysis, job development and placement, benefit systems (e.g., SSDI), functional skills assessment, individualized labor market research, and forensic applications were identified as important underlying knowledge areas. Respondents' mean importance ratings of the items ranged from 3.72 to 4.34.

Factor IV--Implications of Disability

This factor included five knowledge items focusing on the functional limitations and/or vocational implications of disability, such as medical, psychosocial, psychological/psychiatric, pharmacological, and social/ecological aspects of disability. Mean importance ratings of the items ranged from 3.69 to 4.50.

Factor V--Communication

This factor included five knowledge items related to verbal and written communication used to convey information and evaluation results. Knowledge related to vocational interviewing, counseling techniques, development of individualized recommendations, and evaluation report development was identified in this factor. Respondents' mean importance ratings of the items ranged from 4.33 to 4.63.

Factor VI--Professional Networking and Coordination

This factor included four knowledge items related to developing effective working relationships across or within disciplines. Items included basic negotiation and mediation techniques as well as principles of case management/coordination and computer literacy/applications. Importance item means ranged from 3.54 to 4.13 for all respondents.

CVEs also rated each knowledge domain on their perceived degree of preparedness attained through education and/or training. Based on mean preparedness ratings, Communication (M = 4.17) was rated with the highest degree of preparedness, followed by Standardized Assessment (M = 3.92), Implications of Disability (M = 3.69), Occupational Information (M = 3.54), Foundations of Vocational Evaluation (M = 3.52), and Professional Networking and Coordination (M = 3.29).

Variation of Importance Ratings Across Employment Settings

To investigate the variance of importance ratings of CVE job functions and knowledge domains across employment venues, multivariate analyses of variance (MANOVA) were used. Employment settings were grouped into the following six categories:

1) State/Federal/Public; 2) Private Not-for-Profit; 3) Private For-Profit; 4) School Based Facility; 5) University/College; and 6) Self-Employed.

Job Functions

Based on the mean factor scores of the job functions, a MANOVA was computed to test for differences in CVE importance ratings among the six employment settings. A significant multivariate F (Wilks' Lambda = .64, F(30, 2090) = 8.13, p < .05) was found. Univariate analyses of variance (ANOVA) were then computed for each of the dependent variables to identify specific differences between the mean scores of the job functions. These analyses revealed that significant differences existed on four of the six job functions.
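
A minimal sketch of this type of analysis in Python, using the statsmodels library, is shown below. The data frame, the column names f1 through f6, and the setting labels are hypothetical stand-ins for the study's factor scores and employment-setting variable; the sketch only illustrates the MANOVA-plus-follow-up-ANOVA sequence described above.

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm

# Simulated stand-in for the CVE data: six job-function factor scores per respondent
# plus an employment-setting category (the real analysis used the survey responses).
rng = np.random.default_rng(42)
settings = ['public', 'nonprofit', 'forprofit', 'school', 'university', 'selfemployed']
df = pd.DataFrame(rng.normal(4.0, 0.6, size=(558, 6)),
                  columns=[f'f{i}' for i in range(1, 7)])
df['setting'] = rng.choice(settings, size=len(df))

# One-way MANOVA: do the six factor-score means differ by employment setting?
manova = MANOVA.from_formula('f1 + f2 + f3 + f4 + f5 + f6 ~ setting', data=df)
print(manova.mv_test())          # reports Wilks' lambda among the multivariate tests

# Follow-up univariate ANOVAs, one per job function
for dv in [f'f{i}' for i in range(1, 7)]:
    model = ols(f'{dv} ~ C(setting)', data=df).fit()
    print(dv)
    print(anova_lm(model, typ=2))
```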

Follow-up Bonferroni pairwise comparisons revealed significant differences in mean importance ratings on Factor II, Behavioral Observation and Evaluation Techniques (Behavioral Observation). CVEs in university (M = 3.93), state/federal/public (M = 3.78), school-based (M = 3.85), and private not-for-profit (M = 3.66) settings rated Behavioral Observation as more important than CVEs in private for-profit settings (M = 3.23). CVEs in university (M = 3.93) and state/federal/public (M = 3.78) settings also rated this factor as more important than CVEs who were self-employed (M = 3.45). Behavioral Observation thus appears to be relatively less important in both of the private profit-generating (self-employed and private for-profit) employment settings.

CVEs in state/federal/public agencies (M = 3.98) rated Case Management as more important than CVEs in self-employed (M = 3.66) and private for-profit (M = 3.60) settings; the Case Management job function is thus viewed as less important in both private profit-generating settings. For Occupational Analysis, CVEs who were self-employed (M = 4.18) rated the function as more important than CVEs in state/federal/public (M = 3.87), private not-for-profit (M = 3.63), and school-based (M = 3.55) settings; CVEs in private for-profit settings (M = 4.05) rated it as more important than CVEs in private not-for-profit and school-based settings; and school-based CVEs also rated it as less important than CVEs in university settings (M = 3.98). As might be predicted, when compared to CVEs across the various public and private not-for-profit employment settings, CVEs in private profit-generating settings and university-based CVEs perceive Occupational Analysis as a more important job function.
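
The Bonferroni follow-up can be sketched as pairwise t tests across settings with the significance level divided by the number of comparisons. This illustrative snippet reuses the simulated data frame from the MANOVA sketch above (column f2 standing in for the Behavioral Observation factor scores) and is not the authors' procedure verbatim.

```python
from itertools import combinations
from scipy.stats import ttest_ind

# Pairwise Bonferroni-corrected comparisons of one job function's importance ratings
# across the six employment settings (df as built in the MANOVA sketch above).
groups = {name: grp['f2'].to_numpy() for name, grp in df.groupby('setting')}
pairs = list(combinations(groups, 2))
alpha_adjusted = 0.05 / len(pairs)        # Bonferroni correction for the 15 comparisons

for a, b in pairs:
    t_stat, p_value = ttest_ind(groups[a], groups[b])
    marker = '*' if p_value < alpha_adjusted else ''
    print(f'{a} vs {b}: t = {t_stat:.2f}, p = {p_value:.4f} {marker}')
```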

Knowledge Domains

Based on the mean factor scores of the knowledge domains, a MANOVA was computed to test for differences in CVE importance ratings among the six employment settings. A significant multivariate F (Wilks' Lambda = .68, F(30, 2094) = 6.86, p < .05) was found. Univariate ANOVAs were computed for each of the dependent variables to determine the specific differences between the mean scores of the knowledge domains. These analyses revealed that significant differences in mean scores existed on four of the six knowledge domains.

Follow-up Bonferroni pairwise comparisons revealed that CVEs in state/federal/public settings (M = 4.11) rated knowledge of the Foundations of Vocational Evaluation as more important than CVEs in private for-profit settings (M = 3.76). Practitioners who were self-employed (M = 4.25) or employed in private for-profit settings (M = 4.24) rated knowledge of Occupational Information as more important than CVEs in private not-for-profit (M = 3.82) and school-based (M = 3.73) settings; that is, CVEs in private profit-generating settings perceive knowledge of Occupational Information as more important than their counterparts in private not-for-profit and school-based settings. Finally, CVEs employed in school-based settings perceived knowledge of the Implications of Disability as relatively less important than CVEs in state/federal/public agency settings.

To summarize, the principal component analysis of the job task items identified six major job functions as important to vocational evaluation practice. This six-factor solution accounted for 49% of the total variance. Certified vocational evaluation specialists (CVE) rated all six job functions as important, with each function rated at or above moderately important (> 3.00). In terms of relative importance, based on mean ratings, Professionalism (M = 4.57) was perceived as the most important job function. Clinical Skills to Analyze and Synthesize Assessment Data (M = 4.45) and Vocational Counseling (M = 4.14) were rated between very important and extremely important. Occupational Analysis (M = 3.90), Case Management (M = 3.77), and Behavioral Observation and Evaluation Techniques (M = 3.57) were rated between moderately important and very important (see Table 3).

In addition to the six major job functions, a principal component analysis identified six major knowledge domains important to vocational evaluation practice. The six-factor solution accounted for 58% of the total variance of the knowledge items. Certified vocational evaluation specialists (CVE) rated all six knowledge domains as important, with each domain rated at or above moderate importance (3.00). Based on the mean ratings, in terms of relative importance, Communication (M = 4.48) was perceived as the most important knowledge domain. Implications of Disability (M = 4.16), Standardized Assessment (M = 4.16), and Occupational Information (M = 4.06) were rated between very important and extremely important. Professional Networking and Coordination (M = 3.92) and Foundations of Vocational Evaluation (M = 3.91) were rated at the high end of moderately important, closer to a very important rating (see Table 4).

Discussion

Overall, the research findings are consistent with previous research efforts (Coffey, 1978; Pruitt, 1972; Taylor & Bordieri, 1993; Taylor, Bordieri, Crimando, et al., 1993) that describe a similar core of common skills and knowledge required by all vocational evaluators. Criticisms of previous research have included low response rates and small or narrow (nonnational) samples. The noteworthy sample size and response rate (70%) of this research study coupled with the content validation of the Delphi study lends confidence to the findings.

Over the years, the role and function of vocational evaluators has evolved and become increasingly well established. Initially, core job functions ranged from seven to nine factors (Coffey, 1978; Pruitt, 1972), but more recently they have stabilized at six factors, as evidenced in this study and consistent with other recent findings (Leahy & Wright, 1988; Taylor & Bordieri, 1993). Each consecutive research investigation further defines and clarifies the important job functions. Unique to the current study are the new empirically based descriptions of the major domains of knowledge that provide the foundation for vocational evaluation practice. Empirical validation of both the major knowledge domains and job functions provides a firm basis for defining the essential competencies necessary for all vocational evaluators.

Comparison to Previous Research

Many studies of vocational evaluation have built upon previous research, and this study is no exception. Similarities exist between the current study and the research efforts from the past 30 years. However, the following discussion primarily focuses on a comparison of this study to the earlier Taylor and Bordieri (1993) national role and function study.

Demographically, many similarities in age, gender, employment setting, and educational attainment exist between this study and the national study by Taylor and Bordieri (1993). Respondents in both studies averaged in their early to mid-40s, and more than 60% of respondents in each study were female. While approximately 70% of respondents in both studies possessed a Master's level education, fewer than 25% of respondents across the two studies combined identified vocational evaluation as their degree major/emphasis.

Of particular note between the two studies are the employment settings of respondents. The current study found over 43% of CVEs employed in private profit-generating (i.e., private for-profit and self-employed) venues, with less than 20% in private not-for-profit settings (19.9%) and 8.2% in school facilities. In comparison, in 1993 the largest proportion (33%) of respondents was based in public schools, with only 22.9% employed in private for-profit settings and 29.6% in private non-profit settings. In the 10 years between the studies, the proportion of vocational evaluators in diverse private sector settings has roughly doubled.

Major Knowledge Domains and Job Functions

The six job functions are consistent with the results of the 1993 study, and five of the six job functions between the studies are highly similar in assigned descriptive labels and job task items. The similarities occur on the following job functions: Behavioral Observation, Case Management, Occupational Analysis, Vocational Counseling, and Professionalism.

An essential difference between the studies occurs with the 1993 job function labeled Standardized Assessment (Factor IV) that included tasks related to the selection, scoring, interpretation, and use of statistical concepts common to standardized tests and instruments. In the current study, these types of job tasks did not combine to form a major job function but were infused primarily throughout the first two factors: Factor I, Clinical Skills to Analyze and Synthesize Assessment Data (Clinical Skills), and Factor II, Behavioral Observation and Evaluation Techniques (Behavioral Observation). The Clinical Skills job function encompassed those skills and tasks necessary to analyze and synthesize the data gathered during the vocational evaluation process and provide accurate and meaningful interpretation of the data. The Behavioral Observation job function included items that focus on applied assessment and evaluation methods including work samples, standardized instruments, observation and recording of behavior, and situational and community based assessment.

From a statistical perspective, the Clinical Skills function is deemed critical: it represents the first factor extracted in the PCA, explaining 31% of the total 49% of variance and demonstrating very high (α = .94) internal consistency. In terms of relative importance, second only to the job function of Professionalism (M = 4.57), CVE practitioners rated the Clinical Skills job function as Very Important to Extremely Important (M = 4.45), with usage ranging from Very Frequently to Most of the Time (M = 4.18).

Other research efforts have also documented areas comparable to the Clinical Skills function. Coffey's (1978) cornerstone role and function study labeled one of its nine competency categories Analysis and Synthesis of Evaluation Data. In Leahy and Wright's (1988) findings, vocational evaluators were unique in their perceptions of competency in the Assessment Planning and Interpretation area. Clarified and empirically validated in this study, the clinical skills necessary to analyze and synthesize assessment data have evolved into a major and essential job function of vocational evaluators.

Factor II, Behavioral Observation and Evaluation Techniques, concurs with previous findings suggesting that evaluation methods such as work samples and situational and community-based assessments are not distinct entities (Taylor & Bordieri, 1993). Instead, the assessment methods and instruments combine to form a broader job function encompassing many types of evaluation techniques, with behavioral observation and recording as the unifying function. Stated another way, it is not so much the method or the assessment venue (environment) that is the unique feature of this job function; rather, it is the systematic observation and recording of an individual performing the work or the work activities that links the various assessment types to this job function.

Differences in relative importance and frequency ratings of job functions were also noted between the studies. In the 1993 study, Vocational Counseling was extracted as Factor I, and respondents ranked it as the most important and most frequently used job function. In the current study, the Vocational Counseling job function was extracted as Factor V, and it was ranked third in terms of relative importance and frequency of use. The shift in relative importance should not be misinterpreted to infer that vocational evaluators do not require or use vocational counseling skills. Rather, the change in the importance and frequency ranking of Vocational Counseling is likely an artifact of the items and the definition of Vocational Counseling, as well as a sample that is more specifically representative of vocational evaluators than other samples in the earlier research.

Differences Between Employment Settings

In this study, differences in importance between practice settings appear to reflect the particular emphases and objectives directly related to the employment setting (i.e., philosophy, mission, client population served, and expected outcomes) rather than distinctions in the fundamental knowledge and skills required of the vocational evaluator. In general, the significant differences occurred between vocational evaluators employed in private for-profit settings driven by cost containment and their counterparts in public agencies, schools, and/or private not-for-profit settings. The two sectors have differing priorities and, as a result, use different assessment methodologies in the vocational evaluation process. Between the two sectors, job function Factor IV, Occupational Analysis, and knowledge domain Factor III, Occupational Information, consistently demonstrated significant differences in importance and frequency ratings. CVEs in the private sector rated pragmatic, cost-effective, and less time-intensive evaluation methods (e.g., transferable skills analysis, job matching, labor market research) related to Occupational Analysis and/or Occupational Information as more important and more frequently used than their counterparts in schools, public agencies, and/or private not-for-profit settings. Conversely, CVEs in public agencies, school and university facilities, and private not-for-profit settings generally rated the more time-intensive, comprehensive evaluation methods as more important and more frequently utilized.

The focus and time spent on specific work duties vary between employment settings, and these findings offer some insight into how that variation transpires in actual practice environments. These results support earlier rehabilitation research that has consistently found organizational factors, such as mission and/or philosophy, to be potent mediators of job performance, emphasis, or time spent on any particular job function (Feinberg & McFarlane, 1979; Leahy, 1986; Taylor & Bordieri, 1993; Taylor, Bordieri, Crimando, et al., 1993).

Conclusions

Building on the research efforts of the past 30 years, the job functions and knowledge domains identified in this research study confirm that vocational evaluation has evolved into a profession with a stable core of common skills and competencies required of all vocational evaluators. The importance of job functions and knowledge domains differed significantly for some employment settings, most notably between private sector settings and the more traditional settings in public, not-for-profit, and school facilities. Despite the variation, the role and function of vocational evaluators across employment settings is more similar than different, varying in degrees of importance rather than in actual functioning.

The findings of this study serve to further solidify and delineate contemporary CVE job tasks and functions and provide the first empirically based descriptions of the knowledge domains underlying vocational evaluation practice. In combination, they provide the foundation to define the critical competencies necessary for the effective job performance of a vocational evaluator. Designed to support CCWAVES and grounded in the model of job analysis (Henderson, 1996), the research findings provide the empirical basis for outlining content-valid, job-related practices and the foundation for defensible CVE examination development.

However, the findings and recommendations from this study extend beyond CCWAVES examination development. Vocational evaluation educators at the preservice level can use this research data to evaluate and update curricula to assure comprehensive course coverage of the six major knowledge domains. The emphasis in curriculum and professional development training for vocational evaluators must be competency-based rather than content-based. Consistent with a growing trend in education and congruent with the underlying philosophy of vocational evaluation, graduate education programs should examine and consider implementing models of authentic assessment which include performance-based portfolios and/or competency matrices.

Ultimately, competencies must be demonstrated in a job context. The occupational realities for 21st century vocational evaluators include diverse settings beyond the traditional public sector applications. Almost half (43%) of the CVEs in this study were employed in private profit-generating (private for-profit or self-employed) settings, roughly double the proportion reported in 1993. Graduate education programs must address the variation of important job functions and knowledge domains across employment settings. Professionally prepared graduates should not only possess the core knowledge, skills, and abilities (KSA), but they must also be able to implement those KSAs in specific practice environments. To maximize employment opportunities and ensure that graduates can adequately demonstrate skills in a variety of job contexts, vocational evaluation graduate programs are strongly encouraged to develop specific concentration or focus areas that emphasize practice applications representative of specific employment settings.

Although all six knowledge domains were identified as important, the underlying competencies related to knowledge of the Foundations of Vocational Evaluation are critical to the preparation of competent vocational evaluation practitioners. Forty percent (40%) of the total 58% of explained variation is attributable to the knowledge encompassed in Factor I, the Foundations of Vocational Evaluation domain. Specific items in this domain are directly related to the theory and methodologies unique to vocational evaluation. Coupled with the supporting competencies developed from Standardized Assessment (Factor II), these particular knowledge domains appear to represent a large proportion of the underlying competencies required for effective performance in the identified job functions of Clinical Skills to Analyze and Synthesize Assessment Data (Factor I) and Behavioral Observation and Evaluation Techniques (Factor II).

The paucity of available course work offering curricula in these critical knowledge areas unique to vocational evaluation continues to challenge and undermine both professional and certification efforts. In this study, only 17% (93 of the total 558) of CVE respondents reported an educational degree with a major emphasis or concentration in vocational evaluation. Graduate education programs offering an emphasis in vocational evaluation are dwindling (Thomas, 1996). The 10 existing graduate programs are tied to more general degree programs (e.g., rehabilitation, special education) and vary greatly in the number of courses, enrollees, graduates, and faculty expertise. While many of the 10 programs are comprehensive and offer several distinct courses in vocational evaluation, others offer a restricted program that includes 1 or 2 generic assessment courses designed to "meet" most of the specific vocational evaluation knowledge domains (Hamilton, McDaniel, Leconte, & Stewart, 2003).

Eighty-three percent (83%) of CVEs in this study cited short-term, on-the-job, and/or in-service training as their sources of training specific to vocational evaluation. Although far from ideal, such short- and long-term training endeavors have in the past allowed vocational evaluators who lacked formal education to reduce knowledge gaps. Unfortunately, the once plentiful Rehabilitation Services Administration (RSA) training dollars available for short- and long-term training have diminished. This reduction in training funds exacerbates an already serious problem: a lack of opportunity to provide even the most basic professional development training to vocational evaluators with limited formal education (Thomas, 1996). The long-standing deficiency of available education and training stymies the discipline and further contributes to existing perceptions that the role of a vocational evaluator is that of a technician. Vigorous efforts are urgently needed to advocate for and secure ongoing short- and long-term sources of training for the vocational evaluation profession.

These research findings provide fertile ground for educators and professional organizations to focus their efforts toward the standardization of education and training. As a guiding document, these findings may stimulate additional discussion in such areas as: (a) redefinition of CCWAVES eligibility criteria to specify acceptable course work in vocational evaluation; (b) review of vocational evaluation programs funded by RSA to ensure adequate coverage of competencies/content specific to vocational evaluation, as well as a commitment to meet the short-term and long-term training needs of the profession; and (c) reconsideration by the Council on Rehabilitation Education (CORE) of the development of specialization tracks in vocational evaluation.

Characterized by distinct roles within diversified settings, vocational evaluators are no longer solely nested within traditional rehabilitation settings. Their increased presence in the private sector settings confirms Thomas' (1999) predictions that vocational evaluation in the 21st century would gain professional independence and recognition as a service that has value to society. Vocational evaluation has matured and established itself as a unique and legitimate profession with distinct job functions and knowledge domains. An essential hallmark of a profession is a legally defined and enforceable scope of practice (Nugent, 1981), and the findings of this research assist the vocational evaluation profession to this end.
Table 1
Job Task Importance Ratings per Factor: Mean, SD, Cronbach's Alpha (N = 549)

Factor 1: Clinical Skills to Analyze & Synthesize Assessment Data (M = 4.45, SD = .63, α = .94)
  60. Clearly identify client's vocational strengths and needs in the vocational evaluation report. (M = 4.66, SD = .74)
  81. Interpret evaluation results to client emphasizing the relationship between the results and the world of work. (M = 4.46, SD = .89)
  61. Relate evaluation results to the needs of the client, referral source, educational institutions and the labor market. (M = 4.47, SD = .85)
  45. Select and administer standardized tests/instruments such as interest, aptitude, values, temperament, achievement, dexterity, and/or learning style. (M = 4.43, SD = .94)
  54. Interpret evaluation results to client emphasizing the relationship between the results and world of work. (M = 4.37, SD = .93)
  80. Analyze, synthesize, and interpret evaluation results. (M = 4.65, SD = .74)
  55. Proficiently score, norm, and interpret standardized tests/instruments. (M = 4.47, SD = .97)
  58. Individualize and prioritize recommendations based on needs of client. (M = 4.51, SD = .75)
  56. Obtain feedback from client regarding their understanding of evaluation information. (M = 4.24, SD = .98)
  57. Relate evaluation results to occupational groups and/or specific jobs and identify required education and training. (M = 4.26, SD = .97)
  37. Clarify with client the purpose and benefits of the vocational evaluation and discuss mutual expectations. (M = 4.37, SD = .93)
  14. Incorporate client's interests, abilities and needs to identify career goals and/or generate career or training alternatives. (M = 4.60, SD = .80)
  31. Provision of effective and timely vocational evaluation services (e.g., admission, scheduling, report submission). (M = 4.43, SD = .87)
  51. Recognize limitations of standardized tests/instruments for use with specific populations (e.g., culturally diverse, individuals with disabilities). (M = 4.32, SD = .96)
  46. Present client with inconsistencies between stated vocational goals and demonstrated behaviors. (M = 4.23, SD = .95)
  7. Conduct initial vocational interview with client. (M = 4.63, SD = .84)

Factor 2: Behavioral Observation & Evaluation Techniques (M = 3.57, SD = .84, α = .92)
  29. Select and administer appropriate work samples. (M = 3.59, SD = 1.47)
  47. Use behavior observation scales and techniques (e.g., time sampling, point sampling) with work samples. (M = 3.11, SD = 1.34)
  5. Use systematic behavioral observation techniques to describe, record and interpret client work performance and behavior during work sample, situational or community-based assessment. (M = 3.85, SD = 1.36)
  40. Adapt or modify work samples and/or other evaluation techniques to facilitate client performance. (M = 3.59, SD = 1.39)
  79. Develop work samples (or less formal work tasks) based on local labor market and/or vocational training opportunities. (M = 2.95, SD = 1.32)
  3. Communicate modifications to and/or limitations of standardized assessment results. (M = 3.46, SD = 1.21)
  34. Develop rating forms and checklists to be used with client during situational or community-based assessment. (M = 3.06, SD = 1.38)
  48. Evaluate standardized tests/instruments for reliability, validity and appropriate norm groups. (M = 3.60, SD = 1.28)
  74. Provide in-service training to school or agency personnel on vocational evaluation services. (M = 3.33, SD = 1.23)
  49. Adapt standardized instruments to the individual needs of client. (M = 3.72, SD = 1.28)
  50. Research and update standardized tests/instruments (e.g., replace outdated and/or obsolete tests). (M = 3.81, SD = 1.19)
  13. Select evaluation techniques based on referral questions. (M = 4.15, SD = 1.08)
  43. Use multiple measures (triangulate) during evaluation to compare and identify abilities or inconsistencies (e.g., interest testing and manifest interests observed during evaluation). (M = 4.03, SD = 1.12)
  2. Articulate advantages and disadvantages of different approaches to vocational assessment (e.g., levels of assessment and techniques used in each). (M = 3.16, SD = 1.22)
  25. Interpret statistical concepts associated with standardized tests/instruments (e.g., mean, percentile, standard score, standard error of measurement). (M = 3.78, SD = 1.16)
  71. Incorporate general assistive technology into evaluation to facilitate client performance. (M = 3.91, SD = 1.11)

Factor 3: Case Management (M = 3.77, SD = .74, α = .91)
  7. Conduct situational or community-based assessment opportunities to observe specific client work performance and behaviors. (M = 3.41, SD = 1.32)
  85. Demonstrate appropriate follow-up procedures. (M = 4.05, SD = 1.02)
  83. Conduct worksite assessment to identify job accommodations or modifications. (M = 3.72, SD = 1.21)
  65. Collaborate with other rehabilitation providers to effectively coordinate services in an appropriate and timely manner. (M = 4.01, SD = 1.12)
  63. Aid client to prioritize various needs such as type of work, environment, wages and benefits desired. (M = 3.86, SD = 1.05)
  73. Use job analysis to select or develop a work sample. (M = 3.13, SD = 1.31)
  62. Conduct staffing with client and referral source (incorporating family and/or other members of interdisciplinary team where appropriate). (M = 3.93, SD = 1.19)
  68. Educate client regarding rights related to vocational assessment according to state and federal law. (M = 3.56, SD = 1.27)
  75. Assess client's independent living skills as they relate to employability. (M = 3.65, SD = 1.20)
  42. Assess specific work ecology (e.g., performance demands, supervision style, social demands) of employment environment to determine fit with client needs and abilities. (M = 3.86, SD = 1.10)
  84. Use principles of learning theory to develop educational and vocational plans with client (e.g., learning style assessment). (M = 3.68, SD = 1.17)
  76. Evaluate client's job seeking (e.g., interviewing skills) and job keeping (e.g., punctuality) skills. (M = 4.18, SD = 1.00)
  35. Establish effective professional relationships with relevant business, community and agency entities. (M = 3.94, SD = 1.08)
  72. Promote vocational evaluation as a profession to government and private organizations. (M = 3.76, SD = 1.24)
  26. Demonstrate personal stress management skills. (M = 3.52, SD = 1.25)
  69. Develop an individualized vocational evaluation plan to guide and document evaluation activities. (M = 4.04, SD = 1.21)

Factor 4: Occupational Analysis & Information (M = 3.90, SD = .68, α = .85)
  1. Conduct labor market research/analysis to determine existing jobs consistent with client's skills, abilities, interests and limitations. (M = 3.69, SD = 1.18)
  21. Identify wage and salary information for various jobs. (M = 3.82, SD = 1.09)
  23. Identify transferable skills by analyzing client work history and functional assets and limitations. (M = 4.30, SD = .92)
  22. Provide expert opinion or testimony regarding employability and rehabilitation feasibility of client. (M = 3.54, SD = 1.36)
  82. Use published occupational information such as GOE, O*NET, DOT, NOC (Canada) for career exploration and/or to classify local jobs. (M = 4.30, SD = .95)
  4. Conduct job analysis and/or task analysis. (M = 3.45, SD = 1.19)
  52. Adequately document all evaluation findings sufficient for legal testimony or legal records. (M = 4.39, SD = .99)
  28. Keep current on emerging trends in labor market (e.g., skill sets, workplace trends, career technical programs). (M = 4.25, SD = .90)
  19. Use computerized job-matching systems to assist with job placement recommendations. (M = 3.18, SD = 1.25)
  20. Compare occupational worker traits to client's work skills and abilities. (M = 3.98, SD = 1.04)
  30. Review professional literature related to business, labor markets, medicine, education, rehabilitation and vocational evaluation and apply to professional practice. (M = 3.99, SD = .93)

Factor 5: Vocational Counseling (M = 4.14, SD = .63, α = .87)
  10. Identify community resources and supports available to client to enhance employability and/or overall quality of life. (M = 4.33, SD = .88)
  16. Discuss client's vocational plans when they appear inappropriate. (M = 4.39, SD = .88)
  18. Discuss with client relevant psychosocial issues (e.g., family influences, adjustment) that may impede or support vocational success. (M = 3.96, SD = 1.04)
  11. Incorporate culturally diverse approaches into vocational planning and decision-making. (M = 3.66, SD = 1.14)
  6. Facilitate exploration of vocational alternatives by recommending educational or occupational materials. (M = 3.95, SD = 1.03)
  8. Identify psychological (e.g., depression, suicidal ideation) issues requiring referral or consultation. (M = 3.92, SD = 1.11)
  38. Determine level of intervention necessary for job placement (e.g., job club, supported employment, on-the-job training). (M = 4.16, SD = 1.07)
  36. Identify and recommend functional or skill remediation services to enhance client's successful job placement. (M = 4.23, SD = .96)
  27. Create a collaborative environment involving the client in the decision-making process. (M = 4.34, SD = .93)
  32. Recognize evidence of secondary disabilities not previously identified (e.g., substance abuse or learning difficulty). (M = 4.30, SD = .88)
  12. Gather, analyze and interpret referral and biographical data. (M = 4.29, SD = .93)

Factor 6: Professionalism (M = 4.57, SD = .43, α = .68)
  41. Adhere to ethical and legal principles/practices of profession. (M = 4.89, SD = .42)
  77. Use professional terminology appropriate to intended audience. (M = 4.29, SD = .91)
  33. Follow guidelines of relevant certification and/or accrediting bodies (e.g., CCWAVES, CARF, OHSA). (M = 4.44, SD = .92)
  9. Maintain a professional demeanor with clients, staff and other professionals. (M = 4.87, SD = .41)
  70. Seek/obtain continuing education in relevant professional areas. (M = 4.49, SD = .78)
  78. Maintain overall awareness of social and environmental barriers that impact persons with disabilities. (M = 4.40, SD = .79)

* α = Cronbach's alpha

Table 2
Knowledge Areas per Factor Importance--Mean, SD, Alpha (N = 550)

                                                  Mean  SD  [alpha]*

Factor I: Foundations of Vocational Evaluation  3.91  .69  .94

 36. Development and use of situational and community-based assessment.  3.63  1.21
 24. Concepts of teaching/training/educating/presenting (e.g., teaching or training a client a specific task or program).  3.59  1.09
  8. Behavioral techniques used in evaluation (e.g., reinforcement, modeling, chaining).  3.64  1.09
 11. Characteristics of work performance and work behavior.  4.11  .88
  9. General principles of learning and learning assessment (e.g., learning styles and learning versus performance).  3.82  1.04
 10. Modification and accommodation of evaluation techniques.  4.01  1.00
 16. General concepts of assistive technology.  4.01  .93
 22. Assistive technology devices and services.  3.98  .95
 42. Systematic behavioral observation skills and techniques.  4.22  .94
 14. Concepts of work adjustment and work hardening.  3.83  1.00
  4. Service delivery systems common to vocational evaluation.  3.89  .98
 12. Ecological variables that may impact employability (e.g., accessibility, attitude of co-workers).  3.91  .96
  5. Theory and use of work samples and commercial vocational evaluation systems (including job matching systems).  3.86  1.14
  3. Community resources and support programs.  4.16  .89
  7. Individualized vocational evaluation planning.  4.33  .94
 23. Employer and workplace needs and standards.  3.92  .95
 18. Worker traits.  4.10  .92
 51. Program evaluation and research (e.g., outcome, satisfaction).  3.73  1.04
 37. Cultural implications of disability.  3.76  1.04
  2. General theories of career development and vocational decision-making.  3.77  .99

Factor II: Standardized Assessment  4.16  .70  .91

 31. Scoring and interpreting standardized tests/instruments.  4.45  .89
 28. Selection and administration of standardized tests/instruments.  4.39  .91
 30. Statistical concepts related to reliability, validity and norming of standardized tests/instruments.  4.02  1.00
 44. Analysis, synthesis and interpretation of evaluation results.  4.55  .75
 26. Principles of psychological measurement (e.g., psychological testing).  4.05  .99
 45. Legal and ethical uses of standardized tests/instruments.  4.27  .92
 48. Factors impacting standardized testing (e.g., culture, performance anxiety, environment).  4.18  .89
 43. Vocational evaluation philosophy and process.  3.94  1.02
  1. Triangulation of evaluation techniques (e.g., compare standardized test results to behavioral observations and/or work history).  4.08  1.05
  6. General concepts of multiple and emotional intelligence(s).  3.63  1.01

Factor III: Occupational Information  4.06  .71  .88

 21. Transferable skills analysis.  4.34  .85
 17. Standardized occupational information and classification systems (e.g., DOT, GOE, O*NET, NOC [Canada]).  4.24  .95
 33. Labor market research and analysis.  4.05  1.00
 15. Job analysis and task analysis.  4.07  .96
 20. Job development and job placement.  3.94  1.11
 53. Forensic applications of vocational evaluation (e.g., expert testimony regarding loss of earning capacity and/or vocational rehabilitation feasibility).  3.72  1.23
 25. Use of web-based resources to obtain occupation, education or training information (e.g., job postings, educational programs, wage/salary information).  3.90  1.05
 35. Functional skills assessment.  4.29  .82
 19. Common benefit systems (e.g., SSI, SSDI, Workers' Compensation, CPP [Canada], Insurance).  4.02  .98

Factor IV: Implications of Disability  4.16  .68  .86

 38. Psychological/psychiatric aspects of disability.  4.32  .80
 32. Pharmacology; impact of medications, substance use/abuse/addiction on vocational functioning.  4.07  .86
 39. Ecological variables that impact vocational functioning.  3.69  1.02
 34. Medical aspects of disability.  4.50  .73
 40. Psychosocial aspects of disability.  4.24  .83

Factor V: Communication  4.48  .57  .82

 50. Verbal communication skills to convey information and evaluation results.  4.58  .65
 47. Vocational interviewing skills.  4.56  .75
 52. Individualizing and prioritizing recommendations.  4.33  .81
 27. Written communication skills and vocational evaluation report development.  4.63  .65
 41. Vocational counseling techniques and skills.  4.34  .89

Factor VI: Professional Networking & Coordination  3.92  .76  .75

 55. Basic negotiation and mediation techniques.  3.54  1.13
 54. Collaboration skills to develop effective partnerships within and across disciplines.  3.91  1.03
 46. Computer literacy and application skills.  4.11  .91
 49. Principles of case management (e.g., documentation, case file organization, service coordination).  4.13  .97

* [alpha] = Cronbach's alpha
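
The alpha value shown beside each factor in Tables 1 and 2 is Cronbach's alpha computed over the items grouped under that factor, while the Mean and SD columns summarize the 5-point importance ratings for each item. The brief sketch below is illustrative only and is not taken from the study's analysis; the ratings matrix is hypothetical. It shows how these three statistics could be computed from a respondents-by-items matrix of ratings.

import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of ratings."""
    k = ratings.shape[1]                          # number of items in the factor
    item_vars = ratings.var(axis=0, ddof=1)       # variance of each item
    total_var = ratings.sum(axis=1).var(ddof=1)   # variance of the summed scale scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point importance ratings (rows = respondents, columns = items in one factor)
ratings = np.array([
    [5, 4, 4, 5],
    [4, 4, 3, 4],
    [5, 5, 4, 5],
    [3, 4, 4, 4],
])

print(ratings.mean(axis=0))                # per-item mean importance (the Mean column)
print(ratings.std(axis=0, ddof=1))         # per-item standard deviation (the SD column)
print(round(cronbach_alpha(ratings), 2))   # internal consistency of the factor (the [alpha] column)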

Table 3
CVE Relative Importance of Job Functions

 Clinical Skills to Analyze & Synthesize       4.45
 Behavior Observation & Evaluation Techniques  3.57
 Case Management                               3.77
 Occupational Analysis                         3.90
 Vocational Counseling                         4.14
 Professionalism                               4.57

Note: Table made from bar graph.

Table 4
CVE Relative Importance of Knowledge Domains

 Foundations of Vocational Evaluation     3.91
 Standardized Assessment                  4.16
 Occupational Information                 4.06
 Implications of Disability               4.16
 Communication                            4.48
 Professional Networking & Coordination   3.92

Note: Table made from bar graph.
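
The factor-level values plotted in Tables 3 and 4 correspond to the factor means reported in Tables 1 and 2 (for example, Vocational Counseling 4.14 and Communication 4.48). The minimal sketch below shows one common way such a composite could be obtained, assuming each respondent's ratings on a factor's items are averaged and then averaged across respondents; the ratings are hypothetical and the approach is an assumption, not the study's documented procedure.

import numpy as np

# Hypothetical respondents-by-items importance ratings for one factor
ratings = np.array([
    [5, 4, 4, 5],
    [4, 4, 3, 4],
    [5, 5, 4, 5],
])

factor_scores = ratings.mean(axis=1)        # one composite importance score per respondent
print(round(factor_scores.mean(), 2))       # factor-level mean, as plotted in Tables 3 and 4
print(round(factor_scores.std(ddof=1), 2))  # factor-level SD, as shown beside each factor heading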


References

Coffey, D. (1978). Vocational evaluator competencies and their relevance as perceived by practitioners and educators in vocational evaluation. (Doctoral Dissertation, Auburn University). Dissertation Abstracts International, 39, 3364-3365.

Coffey, D. D., Hansen, G. M., Menz, F. E., & Coker, C. C. (1978). Vocational evaluator role and function as perceived by practitioners and educators. Menomonie, WI: University of Wisconsin-Stout, Research and Training Center, Stout Vocational Rehabilitation Institute.

Dowd, L. (Ed.). (1993). Glossary of terminology for vocational assessment, evaluation and work adjustment. Menomonie, WI: Materials Development Center, Stout Vocational Rehabilitation Institute, University of Wisconsin-Stout.

Egerman, K., & Gilbert, J. L. (1969). The work evaluator. Journal of Rehabilitation, 35(3), 12-14.

Emener, B., McFarlane, F., & Parker, R. (1978). Editor's introduction to the NRCA-VEWAA joint issue. Vocational Evaluation and Work Adjustment Bulletin, 11(1), 2-3.

Feinberg, L. B., & McFarlane, F. R. (1979). Setting-based factors in rehabilitation counselor role variability. Journal of Applied Rehabilitation Counseling, 10(2), 95-101.

Gannaway, T. W., & Sink, J. M. (1979). An analysis of competencies for counselors and evaluators. Vocational Evaluation and Work Adjustment Bulletin, 12(3), 3-15.

Hamilton, M. R., McDaniel, R. S., Leconte, P., & Stewart, S. K. (2003). Panel: Advocating for appropriate training of vocational evaluators. Paper presented at the 11th National Forum on Issues in Vocational Evaluation, Assessment, and Work Adjustment, Charleston, SC.

Henderson, J. P. (1996). Job analysis. In A. H. Browning, A. C. Bugbee, & M. A. Mullins (Eds.), Certification: A NOCA handbook (pp. 41-66). Washington, DC: The National Organization for Competency Assurance.

Leahy, M. J. (1986). Competency importance in rehabilitation roles and settings (Doctoral dissertation, University of Wisconsin-Madison, 1986). Dissertation Abstracts International, 48, 0037.

Leahy, M. J., & Wright, G. N. (1988). Professional competencies of the vocational evaluator. Vocational Evaluation and Work Adjustment Bulletin, 21(4), 127-132.

Leahy, M. J., Chan, F., & Saunders, J. (in press). Job functions and knowledge requirements of certified rehabilitation counselors in the 21st century. Journal of Applied Rehabilitation Counseling.

Leahy, M. J., Shapson, P. R., & Wright, G. N. (1987). Rehabilitation practitioners competencies by role and setting. Rehabilitation Counseling Bulletin, 31, 119-130.

Linstone, H. A., & Turoff, M. (Eds.). (1975). The Delphi method: Techniques and applications. Reading, MA: Addison-Wesley.

Nadolsky, J. M. (1971). Patterns of consistency among vocational evaluators. Vocational Evaluation and Work Adjustment Bulletin, 4(4), 13-25.

Newman, I., Waechter, D., Nolte, D., & Boyer-Stephens, A. (1998, Fall/Winter). An assessment of knowledge domains for vocational evaluators: A precursor to a national licensure examination. Vocational Evaluation and Work Adjustment Bulletin, 72-79.

Nugent, F. (1981). Professional counseling: An overview. Pacific Grove, CA: Brooks/Cole.

Pruitt, W. A. (1972). Task analysis of the vocational rehabilitation graduate major with an emphasis in work evaluation: A comparative study of two groups of work evaluators. Menomonie, WI: Graduate College, University of Wisconsin-Stout.

Pruitt, W. A. (1986). Vocational evaluation (2nd ed.). Menomonie, WI: Walt Pruitt and Associates.

Sankovsky, R. (1969). State of the art in vocational evaluation: Report of a national survey. Pittsburgh, PA: Research and Training Center in Vocational Rehabilitation, University of Pittsburgh.

Shumate, S., Hamilton, M., & Fried, J. (2004). Vocational Evaluation and Career Assessment Professionals Journal, 1(1), 25-39.

Sigmon, G. L., Couch, R. H., & Halpin, G. (1987). A comparison of competency studies in vocational evaluation. Vocational Evaluation and Work Adjustment Bulletin, 20(1), 19-21.

Sink, J. M., & Porter, T. L. (1978). Convergence and divergence in rehabilitation counseling and vocational evaluation. Journal of Applied Rehabilitation Counseling, 9(1), 5-20.

Sink, J. M., Porter, T. L., Rubin, S. E., & Painter, L. C. (1979). Competencies related to the work of the rehabilitation counselor and vocational evaluator, Vol. 1. Athens, GA: University of Georgia.

Spieser, A. (1967). The role of the evaluator in a sheltered workshop. In Stout State University, Vocational evaluation curriculum development workshop. Menomonie, WI: Author.

Tabachnick, B. G., & Fidell, L. S. (2001). Using multivariate statistics. Needham Heights, MA: Allyn & Bacon.

Taylor, D. W., & Bordieri, J. E. (1993). Vocational evaluators' job tasks and functions: A national study. Report to the Commission on Certification of Work Adjustment and Vocational Evaluation Specialists. Carbondale, IL: Rehabilitation Institute.

Taylor, D. W., Bordieri, J. E., Crimando, W., & Janikowski, T. P. (1993, Summer). Job tasks and functions of vocational evaluators in three sectors of practice. Vocational Evaluation and Work Adjustment Bulletin, 39-46.

Thomas, S. W. (1996, Spring). Position paper supporting the continued funding of vocational evaluation training by the Rehabilitation Services Administration. Vocational Evaluation and Work Adjustment Bulletin, 4-8.

Thomas, S.W. (1999, January/February/March). Vocational evaluation in the 21st century: Diversification and independence. Journal of Rehabilitation, 10-15.

Vocational Evaluation and Work Adjustment Association. (1975). Vocational evaluation project: Final report [Special issue]. Vocational Evaluation and Work Adjustment Bulletin, 8.

Michelle Hamilton

University of Wisconsin--Stout

Stephen Shumate

University of Wisconsin--Stout

Michelle Hamilton, Ph.D., Rehabilitation and Counseling Department, 227 Vocational Rehabilitation Building, University of Wisconsin--Stout, Menomonie, Wisconsin 54751. E-mail: HamiltonMi@uwstout.edu
