
Upgrading knowledge of vocational evaluators: a report of one state's efforts.

The need for practitioners in rehabilitation services to stay current in the knowledge and skills of their respective fields has been, and continues to be, advocated by researchers and educators. In a special issue of Rehabilitation Education focusing on continuing education for rehabilitation personnel, McFarland (1999) wrote that since 1974 there has been an increased demand and expectation for continuing education from many sectors of the rehabilitation community, which realize how closely the quality of services is tied to current knowledge and skills (p. 1). Closely allied with the acquisition of essential attitudes, knowledge, and skills is the task of improving and implementing collaborative efforts among public rehabilitation agencies and universities to impact research and public policy (McFarland, 1999).

Writing in the same issue, Amick and Wesley (1999) addressed the need for collaborative efforts among agencies and universities. The authors outlined strategies and approaches for implementing a continuing education program for the adult learner or non-traditional student. If rehabilitation agency personnel are to remain effective, the authors argued, these same individuals must continually acquire new skills, techniques, and methodologies for job performance (p. 25). Yet barriers may exist that prevent effective delivery of continuing education, which, for purposes of this article, includes the broad array of traditional instructor-student formats: in-service trainings, workshops, distance learning, and classroom lectures.

Amick and Wesley identified four obstacles to learning that should be addressed when designing continuing education for the adult learner. First, university-agency partnerships need to be reexamined and reworked to be effective in the face of limited resources. Second, designers must understand that adult learners differ from traditional university learners in motivation and experience. Third, it is important that faculty use a variety of learning strategies, including class discussions, collaborative learning and group projects, peer teaching, independent learning, role playing, and case studies, to "customize" the learning experience. Finally, technology is critical to reaching individual learners and delivering continuing education programs. Distance learning, teleconferencing, and videoconferencing can extend continuing education, as well as in-service training and workshops, to individuals and areas that might not otherwise have access to them.

In responding to the question of what is to be learned, much has been written in the literature about the role, competencies, and knowledge domains of vocational evaluation, both for pre-service education programs and as practiced in the public and private sectors (Berven & Wright, 1987; Dew, Garcia, & Forrester, 1999; Eldredge, Fried, & Grissom, 1991; Hamilton, 2003; Leahy & Wright, 1988; Newman, Waechter, Nolte, & Boyer-Stephens, 1998; Rubin & Porter, 1979; Taylor, Bordieri, Crimando, & Janikowski, 1993; Taylor, Bordieri, & Lee, 1993; Wesolek & McFarlane, 1992). However, as Taylor et al. (1993) concluded from their survey of the job functions of vocational evaluators, many respondents claimed to have little formal training in vocational evaluation. The question, then, is how to upgrade practitioners' knowledge of vocational evaluation. The work of Malcolm Knowles (1980, 1984) offers one method for addressing ongoing learning over the span of one's career.

Knowles proposed a model for teaching adults that adapts traditional instructor-centered (pedagogical) methods to the learner who is no longer in an academic setting. Knowles's concept, referred to as learner-centered instruction or andragogy, assumes that adult learners are self-directed, come to the learning environment with a diversity of experiences, generally attach a more practical purpose to learning (e.g., problem solving), and are motivated to learn by internal or intrinsic factors rather than external or extrinsic factors (Imel, 1989).

The endeavor described here resulted from the collaboration of one state's public-sector rehabilitation agency with a university to upgrade the knowledge and effectiveness of vocational evaluators. The vocational evaluators are employed in 19 vocational training centers monitored and staffed by the Tennessee Division of Rehabilitation Services. The focus of this collaboration was to address gaps in professional knowledge that occur when personnel do not possess the preferred academic credentials and practical experience.

An Agency-University Collaboration

The Training and Technical Assistance Project (TTAP) is a state-sponsored program funded entirely by the Tennessee Division of Rehabilitation Services (TNDRS). The Project has existed for more than twenty years and is designed to provide ongoing university-based staff development, training materials, and products to staff in vocational training centers operating under the auspices of TNDRS. An additional function of the Project is to answer questions from field staff and TNDRS about various tests and to offer recommendations concerning the selection and use of testing instruments.

The system of vocational training centers consists of Tennessee Rehabilitation Center (TRC) at Smyrna, a residential and vocational training center located in Smyrna, Tennessee, and 18 smaller centers situated across the state to serve surrounding counties. The centers offer time-limited rehabilitation services (i.e., work adjustment, vocational evaluation). Each center's staff includes a manager, a secretary, a vocational evaluator, and one or two rehabilitation assistants (work adjustment specialists). The TRCs offer rehabilitation services to clients referred by Vocational Rehabilitation Counselors (VRCs).

TTAP is located at the University of Memphis and is housed in the Center for Rehabilitation and Employment Research (CRER), a component of the College of Education (COE). The Center consists of a Vocational Evaluation Lab (VE Lab) and Assistive Technology Center (ATC). Staff include two vocational evaluators in the VE Lab and a Director and Rehabilitation Engineer in the ATC. Individuals with disabilities are referred by local DRS counselors for vocational evaluation and/or assistive technology assessments. Faculty from the Counseling Psychology graduate program provide an added dimension by overseeing and training graduate assistants in that program to conduct psychological assessments.

Method

Decisions about what kinds of training should be offered annually to vocational evaluators are derived from site visits, surveys, and telephone and email contacts. However, familiarity with the academic and experiential backgrounds of new and veteran employees suggested that knowledge and skills were not equivalent across all vocational evaluators. Recognizing this diversity of training needs among personnel, a multiple-choice examination was designed to obtain a comparative measure based on the six knowledge domains identified by CCWAVES as important to the practice of vocational evaluation.

The Instrument

The current examination was designed to approximate the knowledge domains espoused by the Commission on Certification of Work Adjustment and Vocational Evaluation Specialists (CCWAVES, 2004). The references cited in the organization's Standards and Procedures Manual provided the sources for writing the examination questions. The following list compares CCWAVES's knowledge domains with those designated on the TTAP examination. In comparing the relatively longer listing of TTAP Examination domains, the reader should note that each CCWAVES knowledge domain consists of between seven and sixteen sub-domains. The CCWAVES domains and sub-domains were modified to more accurately reflect the tasks performed by TRC Vocational Evaluators.

Our goal was to generate a representative number of items under each knowledge domain. The number of examination items included under each domain was based on the relative weight in importance to job performance given to that domain by TNDRS and Project staff. Thus, Standardized Testing (20 questions) was assumed to have more relevance to the evaluators' daily job performance than Situational and Community-Based Assessment (5 questions), in which few evaluators were trained or experienced. In the list that follows, the number of items for each TTAP knowledge domain is given in parentheses.

CCWAVES Knowledge Domains

1. Principles of Vocational Evaluation

2. Standardized Assessment

3. Occupational Information

4. Implications of Disability

5. Professional Communication

6. Professional Enhancement

TTAP Knowledge Domains

1. Philosophy and Process (5 questions)

2. Job Analysis (5 questions)

3. Occupational Information (5 questions)

4. Functional Aspects of Disability (5 questions)

5. Vocational Interviewing (5 questions)

6. Individual Vocational Evaluation Planning (5 questions)

7. Standardized Testing (20 questions)

8. Work Samples and Systems (10 questions)

9. Situational and Community Based Assessment (5 questions)

10. Behavioral Observation (10 questions)

11. Assessment of Learning (5 questions)

12. Functional Skills Assessment (5 questions)

13. Vocational Evaluation Report Development and Communication (10 questions)

14. Modifications and Accommodation (5 questions)

The current examination was administered during a TNDRS-sponsored workshop for vocational evaluators held in April 2000. A total of 23 vocational evaluators participated in the workshop, conducted by two faculty members of the graduate program in Rehabilitation Counseling at The University of Memphis. The one-day, six-hour workshop consisted of a half-day overview of standardized testing (e.g., validity, reliability, standard scores) followed by the examination and a question-and-answer period. The results of the examination were then to be used to identify the knowledge domains in which participants would benefit from additional training.

Participants

Participants in the upgrade training were Vocational Evaluators employed by the Tennessee Division of Rehabilitation Services (TNDRS). The 23 evaluators participating in the training were located in rural vocational training centers across the state and in a single vocational-residential center in Smyrna. Prior to taking the examination, the evaluators completed a demographic form indicating their academic degree(s) and majors and their years of experience as vocational evaluators. A simple frequency count indicated that ten (43%) had earned a Bachelor's degree in Rehabilitation or a related field and four (17%) had earned a Master's degree in Rehabilitation or a related field (e.g., Psychology). Of the non-related academic degrees, eight (35%) had earned a Bachelor's degree in disciplines as diverse as Public Management and Geography, and one Master's degree (5%) was in a non-related field. Of the 23 participants, sixteen (70%) reported four years or less of experience as a vocational evaluator, four (17%) indicated they had less than one year of experience, and six (26%) had ten years or more of experience.

Results of First Administration

The examinations were scored, and the knowledge domains with the highest percentages of errors were prioritized in descending order. Data were analyzed by individual, by region, and state-wide for use by training officers in the TNDRS and for developing individual training plans by Project staff. The highest percentages of errors occurred in the following knowledge domains:

Assessment of Learning: 62%
Situational/Community-Based Assessment: 45%
Vocational Evaluation Report Development and Communication: 45%
Functional Skills Assessment: 44%
Individual Vocational Evaluation Planning: 43%

The percentage of errors given above reflects the total errors made by all participants in that knowledge domain. Thus, the largest number of wrong answers across all 23 participants came from Assessment of Learning (62%), followed by Situational/Community-Based Assessment (45%), and so on.
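The scoring procedure just described, tallying wrong answers by knowledge domain and ranking the domains by error rate, can be sketched in a few lines of code. This is only an illustration of the aggregation logic; the function name, data layout, and toy figures below are the author's own assumptions, not the Project's actual scoring software or data.

```python
from collections import defaultdict

def domain_error_rates(responses, item_domains):
    """Rank knowledge domains by error rate, highest first.

    responses: list of answer sheets, each a dict mapping item id -> bool
               (True if the item was answered correctly).
    item_domains: dict mapping item id -> knowledge domain name.
    """
    wrong = defaultdict(int)
    total = defaultdict(int)
    for answer_sheet in responses:
        for item, correct in answer_sheet.items():
            domain = item_domains[item]
            total[domain] += 1
            if not correct:
                wrong[domain] += 1
    rates = {d: wrong[d] / total[d] for d in total}
    # Descending order mirrors the prioritization described in the article
    return sorted(rates.items(), key=lambda kv: kv[1], reverse=True)

# Toy example: two participants, three items spanning two domains
item_domains = {1: "Assessment of Learning", 2: "Assessment of Learning",
                3: "Standardized Testing"}
responses = [{1: False, 2: False, 3: True},
             {1: False, 2: True, 3: True}]
ranked = domain_error_rates(responses, item_domains)
```

The same tallies could of course be broken out by individual or by region, as the Project staff did, simply by filtering the `responses` list before aggregating.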

The percentage of items missed was consistent across the three regions of the state (33%, 35%, and 37% for West, Middle, and East, respectively). Scores ranged from 50 to 74, with 11 of 23 evaluators (48%) scoring at or above 70%, the cut-off score used to determine whether an individual had passed the examination.

Individual Training Plans

The sole purpose of the Project examination was to assess the general knowledge of the vocational evaluators in the state's vocational training center system. Since it was not uncommon for TNDRS to employ personnel with academic backgrounds and experience unrelated to Rehabilitation and Vocational Evaluation, a method was sought by which to measure aggregate and individual knowledge about vocational evaluation. This would be important for grounding newly employed evaluators in the foundations and techniques of assessment as well as for familiarizing them with selecting, administering, scoring, and interpreting testing instruments. Once it was determined in what general areas individual evaluators needed further information and practice, individual training plans could be established and implemented through a variety of learning methods, including site visits, homework assignments, and training materials. Table 1 shows the various learning activities associated with each knowledge domain. These activities were also submitted to TNDRS for that agency's approval before Project staff implemented training.

The examination results were conveyed to each vocational evaluator by formal letter showing the total number of items answered correctly and the score for each knowledge domain. Also enclosed with the letter was the list of learning activities (see Table 1) for each domain in which the evaluator missed more than 60% of the items. Evaluators were given one year to complete the written assignments, were encouraged to email or call Project staff with any questions, and were given feedback on the results of each completed activity. When all learning activities were completed, Project staff signed off on the learning plan kept in each individual's training file.

The methods used to implement individual training depended upon the nature of the material to be learned or the skill to be demonstrated. The following list specifies, with examples, the means by which training was carried out.

[check] Write a response to a specific problem. This was generally the preferred method. Example: List the testing instruments that may be administered to a person with mental retardation.

[check] Demonstrate ability to perform a skill. This type of response was usually associated with administering, scoring, and interpreting a test, writing the vocational evaluation report, or developing an evaluation plan. Since observation of the skill was necessary, site visits were used for this purpose.

[check] In-service training/workshops. Example: Advanced training on the McCarron-Dial Evaluation System and VALPAR's Pro3000 software was arranged for all 23 vocational evaluators.

Results of Second Administration of the Examination

Training was completed at the end of one year. Due to the relatively small number of individuals involved and the use of email and workshops, it was possible to accomplish each evaluator's training plan. In May 2001, a second examination was administered during a two-day in-service training program at TRC-Smyrna. This in-service was the third and final training involving guest presenters with expertise in various aspects of vocational evaluation (e.g., test development, administration and interpretation of personality measures, occupational information sources). During the year, one participant in a vocational training center resigned. Although another individual was hired and participated in the second examination, that center's scores could not be compared and therefore were not included for comparative purposes. A total of 23 vocational evaluators participated in the training and examinations.

Results of the second Project examination indicated that all but four of the participating vocational evaluators scored at 70% or higher. For the first administration, 12 of 23 (52%) scored 70% or higher; for the second administration, 17 of 23 (74%) did so, an increase of 22 percentage points. For the four individuals who scored below 70%, additional written assignments were developed to address the deficit items. The assignments consisted of copies of selected materials for self-study. Individuals were given 90 days to answer several questions specific to the deficit areas and to re-submit the answer sheets to Project staff for grading. Based on the results of this final round of testing, all four individuals successfully mastered the materials and achieved an overall score of at least 70%. Table 2 shows test scores for the 23 participants. Certificates of Achievement were then prepared, signed by the Dean of the College of Education at the University of Memphis and the Project's Director, and mailed to each participant.

Outcomes of Training

There were several worthwhile outcomes affecting the knowledge and performance of vocational evaluators employed by TNDRS. The results of the yearlong test-and-train effort influenced state-level staff development efforts with other university and agency entities as well. As of the writing of this article, TNDRS and Project staff could identify several consequences of the training activities:

[check] Vocational evaluators increased their awareness of, and information about, the essential components of assessment and evaluation. The relative weight allotted to learning the techniques of testing, occupational information, and the vocational impact of disability had tended to overshadow other aspects of vocational evaluation; assistive technology, job analysis techniques and procedures, and the technical elements of test standardization, for example, were often overlooked.

[check] Five vocational evaluators who were completing master's degrees in Rehabilitation Counseling through distance learning programs stated that the experience of taking the Project's examination and the subsequent training on specific topics was helpful during their comprehensive examinations.

[check] A new evaluator training program was established that customizes training based on the individual's academic training and experience. The new vocational evaluation employee enters the following regimen over the course of six months:

* One to five days with experienced vocational evaluators in nearby training centers for orientation and introduction to vocational evaluation procedures

* One week with vocational evaluators at TRC Smyrna for practice administration and testing experience with client/customers

* One to five days with the Project's Vocational Evaluation unit to focus on additional testing instruments and procedures, including psychological assessments, assistive technology evaluations, and familiarization with various testing instruments that may not be available in the training centers. The evaluator trainee is provided an agenda designed to address specific deficit areas. Additional time is given over to observation and hands-on experience with testing instruments and clients of the Center.

[check] An individual training plan is developed based upon the results of the Project examination, administered following six months of experience. If there are no significant deficit areas (i.e., the overall score is 70% or better), no further action is taken except at the employee's request (e.g., a site visit or information about specific testing products or materials).

[check] Sole reliance on needs surveys asking vocational evaluators to identify and prioritize unmet training needs was replaced by the Project examination, which offered a clearer picture of each individual's knowledge across the various domains.

[check] Managers of vocational training centers, including TRC Smyrna, increased their awareness of preferable qualifications, credentials, and skills when interviewing applicants for vocational evaluator positions in their respective facilities.

[check] Training materials and resource and reference documents were written and placed in each training center. Examples of these products are monthly fact sheets detailing aspects of a testing instrument, summaries of journal articles pertinent to trends in vocational evaluation, and reviews of new and established testing instruments.

Conclusion

The training initiative described in this paper may be viewed as an informal method or variation on the larger issue of mandatory continuing education (MCE). As Kerka (1994) pointed out, "being a professional implies commitment to one's education and the ability to pursue practice-enhancing learning," which essentially negates the need for mandates. While there are decided pros and cons in the debate about MCE, Kerka reminds us of Nelson's (1988) point that program design and delivery should emphasize consultation and cooperation, not coercion, particularly where standards exist for comparative purposes. Several factors evolved during the course of this project that aided in successfully imparting practice-specific knowledge to vocational evaluators lacking formal training in that field.

1. Recognition that collaborative efforts among public rehabilitation agencies and universities, such as that described in this paper, are critical to establishing and maintaining quality vocational evaluation services.

2. Synthesis of institutional resources to bring practitioners and educators together.

3. Setting clear objectives that everyone involved understands and agrees to.

4. A vehicle for measuring where knowledge gaps exist and a set of standards for comparison.

5. Familiarization with the characteristics of adult learners and support for alternate methods of delivering practice-specific instruction and continuing education.

6. Commitment to long-term learning.

7. Methods for measuring the effectiveness of learning.

Admittedly, the factors listed here are idiosyncratic to the participants and the state in which they work. However, the authors believe that similar applications may be replicated in other states wishing to upgrade and/or strengthen the knowledge of personnel with academic credentials and experience in fields other than rehabilitation. Further, job-related performance assessments of this nature, while not subjected to tests of validity and reliability, are nevertheless useful tools for identifying knowledge common to vocational evaluation.

Acknowledgement. Funding for this project was made possible by a grant from the Tennessee Division of Rehabilitation Services (grant #5-34734) and was administered through the University of Memphis.

References

Amick, S., & Wesley, M. E. (1999). Educating rehabilitation practitioners: Obstacles and opportunities. Rehabilitation Education, 13 (1), 25-36.

Berven, N., & Wright, G. (Eds.). (1987). Research on professional rehabilitation competencies [Special Issue]. Rehabilitation Counseling Bulletin, 31(2).

Commission on Certification of Work Adjustment and Vocational Evaluation Specialists. (2004). Standards and procedures manual for certification in vocational evaluation. Rolling Meadows, IL: Author.

Commission on Certification of Work Adjustment and Vocational Evaluation Specialists. (2001). Retrieved January 3, 2001, from www.ccwaves.org/other/univers.html

Dew, D., Garcia, J., & Forrester, L. (1999). Implications of the comprehensive system of personnel development (CSPD) for rehabilitation continuing education. Rehabilitation Education, 13(1), 15-23.

Eldredge, G., Fried, J., & Grissom, J. (1991). Vocational evaluator training needs: Food for thought. Vocational Evaluation and Work Adjustment Bulletin, 24(1), 11-13.

Hamilton, M. (2003, October). Role and Function of Certified Vocational Evaluation Specialists. Paper presented at the meeting of the National Rehabilitation Association Annual Training Conference, Nashville, TN.

Imel, S. (1989). Teaching adults: Is it different? (ERIC Document Reproduction Service No. ED305495). Retrieved November 12, 2004, from http://www.ericfacility.net/ericdigests/ed305495.html

Kerka, S. (1994). Mandatory continuing education. (ERIC Document Reproduction Service No. ED376275). Retrieved November 15, 2004, from http://www.ericfacility.net/ericdigests/ed376275.html

Knowles, M. S. (1980). The modern practice of adult education (Rev. ed.). Chicago: Follett Publishing Company.

Knowles, M. S. (1984). Introduction: The art and science of helping adults learn. In M. S. Knowles and Associates (Eds.). Andragogy in action: Applying modern principles of adult learning. San Francisco: Jossey-Bass.

Leahy, M., & Wright, G. (1988). Professional competencies of the vocational evaluator. Vocational Evaluation and Work Adjustment Bulletin, 21(4), 127-132.

McFarland, F. R. (1999). The expanded importance and expectations for lifelong learning and continuing education in rehabilitation. Rehabilitation Education, 13 (1), 3-12.

Nelson, J. W. (1988). Design and delivery of programs under mandatory continuing professional education. Studies in Continuing Education, 10(2). (ERIC Document Reproduction Service No. ED384843). Retrieved November 12, 2004, from http://www.ericfacility.net/ericdigests/ed384843.html

Newman, I., Waechter, D., Nolte, D., & Boyer-Stephens, A. (1998). An assessment of knowledge domains for vocational evaluators: A precursor to a national licensure examination. Vocational Evaluation and Work Adjustment Journal, 31(3/4), 72-79.

Rubin, S., & Porter, T. (1979). Rehabilitation counselor and vocational evaluator competencies. Journal of Rehabilitation, 45, 42-45.

Taylor, D., Bordieri, J., Crimando, W., & Janikowski, Y. (1993). Job tasks and functions of vocational evaluators in three sectors of practice. Vocational Evaluation and Work Adjustment Bulletin, 26(2), 39-46.

Taylor, D., Bordieri, J., & Lee, D. (1993). Job tasks and functions of vocational evaluators: A national study. Vocational Evaluation and Work Adjustment Bulletin, 26(4), 146-154.

Wesolek, J., & McFarlane, F. (1992). Vocational assessment and evaluation: Some observations from the past and anticipation for the future. Vocational Evaluation and Work Adjustment Bulletin, 25(2), 51-54.

David F. Roberts

University of Memphis

Ruth J. Roberts

University of Tennessee-Memphis

David F. Roberts, Ph.D., CRC, CVE, Center for Rehabilitation and Employment Research, University of Memphis, 119 Patterson Hall, Memphis, TN 38152. Email: rroberts@memphis.edu
Table 1.
Learning Activities and Knowledge Domains

1. Philosophy and Process
   a. Describe common methods of evaluation and assessment; include the advantages and disadvantages of each method
   b. Write your philosophy of vocational evaluation

2. Job Analysis
   a. Identify the type of information (worker characteristics) one would gather when conducting a job analysis
   b. Develop a job analysis for a sub-contract in your center

3. Occupational Information
   a. List and briefly define the primary components/worker traits given in the D.O.T.
   b. Specify sources of labor market information
   c. Describe how labor market information may be used in vocational evaluation

4. Functional Aspects of Disability
   a. Describe the purpose and use of an assessment of functional abilities
   b. What are the primary traits that should be addressed in the functional assessment?

5. Vocational Interviewing
   a. What is the purpose of conducting the initial interview?
   b. Define the term "transferability of skills" and explain how it is important to the vocational interview

6. Individualized Vocational Evaluation Planning
   a. Identify the essential components of an evaluation plan
   b. Briefly describe the process of developing the evaluation plan
   c. What is the purpose of an evaluation plan?

7. Standardized Testing
   a. Describe how you would determine the scope and duration of a vocational evaluation for a person with (a) Mental Retardation, (b) Spinal Cord Injury, (c) Traumatic Brain Injury, and (d) Bi-Polar Disorder
   b. What is the difference between reliability and validity?
   c. List at least three sources of test information

8. Work Samples and Systems
   a. What are the advantages of commercial and locally developed work samples?
   b. What are the primary considerations for selecting and using one or more components of a work sample system such as VALPAR or VITAS?

9. Situational and Community-Based Assessment
   a. Describe the purpose and use of community-based assessments
   b. What are important considerations when observing and recording behavior during an evaluation?

10. Behavioral Observation
   a. Identify various methods for observing and recording behavior
   b. What are the sources of error that may occur when observing?

11. Assessment of Learning
   a. What are at least three ways that an individual learns?
   b. What recommendations would you make for someone who is planning to enter a vocational school and who is diagnosed with a learning disability in reading?

12. Functional Skills Assessment
   a. Describe the purpose and use of functional skills assessment

13. Vocational Evaluation Report Development and Communication
   a. What are the primary considerations the evaluator must make when structuring and organizing a vocational evaluation report?
   b. What is the client's role in developing the written report?
   c. What are three critical skills the evaluator should demonstrate when writing the vocational evaluation report?

14. Modifications and Accommodations
   a. List the testing modifications and job modifications you would recommend for a person with (a) Mental Retardation, (b) Spinal Cord Injury, and (c) Hearing Impairment

Table 2
Comparison of Vocational Evaluator Test Scores (100 items)

Subject   April 2000 Test   May 2001 Test   September 2001 Test
1         62                72
2         65                82
3         60                71
4         71                80
--        72 *              80 *
5         59                57              78
6         58                72
7         72                73
8         62                76
9         62                64              79
10        72                85
11        70                82
12        52                66              82
13        51                74
14        70                83
15        67                76
16        70                68              80
17        64                77
18        72                77
19        71                85
20        74                81
21        65                77
22        70                85
          Mean = 65.4       Mean = 75.16
          sd = 6.6          sd = 7.4

* Test scores are shown for the two individuals employed in this position during the testing and training phases. These scores were omitted in calculating the means for the first and second test administrations.
COPYRIGHT 2005 National Rehabilitation Association

Publication: The Journal of Rehabilitation, October 1, 2005