Developing a skills checkup.
How do you know your employees are learning everything they ought to? That's a good question. And when a federal regulatory agency is doing the asking, it's especially important to know the answer.
For one member of Robert Morris Associates (RMA), Philadelphia, an association of bank loan and credit officers, that question arose during a routine inspection by the Office of the Comptroller of the Currency. Along with the Federal Reserve and various state agencies, the OCC regulates banking operations. Given the bank failures of the 1980s, which often stemmed from poor credit quality and problem loans, the agencies want assurance that the people handling loans and credit write-ups can do their jobs correctly.
The OCC inspectors had raised a valid concern for all of RMA's 15,000 members: How good were the training programs attended by their employees? In other words, how did that training translate into on-the-job knowledge and effectiveness? To help its members respond to those questions, RMA developed the Diagnostic Assessment of Lending Skills and Knowledge.
Unlike a certification process, which documents that people have already mastered the knowledge required by an industry or profession, the assessment process points out what knowledge people still need to acquire. RMA's diagnostic assessment - a collection of multiple-choice questions grouped by content area - provides specific and objective feedback about a person's skill levels. Our members can use the results to develop an individualized training plan for each employee.
Here are some ways that our members benefit from using the assessment:
* Greater profitability and productivity. A well-trained work force is a key factor in corporate profitability. Knowledgeable employees can make sound lending decisions that, in turn, keep the banking institution financially healthy.
In addition, organizations often train all their employees when only some really need the training. The assessment pinpoints how much - or how little - instruction an employee needs in specific areas. The assessment also helps set training priorities so that banks use their time and money most effectively.
* Justification for training budgets. Some RMA members - particularly those in smaller organizations - can't afford to send people to training for training's sake. The assessment allows them to target their efforts by determining which professional development courses will most benefit a particular employee.
If administered after an employee has attended a course, the assessment offers feedback on how useful the training was. In that way, banks can see their return on investment. They don't have to guess whether their training programs are working - they'll know.
* Employee development and motivation. Once their individual training needs have been identified, employees become more motivated to learn - they have a quantifiable goal. Employees can also use results of the assessment as a guide for career development.
From concept to product
Start to finish, RMA's efforts to develop a diagnostic assessment took less than a year. We followed these steps:
1. Establish objectives. RMA's training diagnostic is "generic," meaning that we offer the same version to all member banks. The target audience is the lender with three to five years of experience.
Depending on its needs, a bank can use the tool to identify gaps between what the job requires and what the person knows; show how well employees measure up to criteria set by a national panel; and evaluate the training it currently provides, whether internally or through external sources.
2. Determine the scope and content. With these objectives in mind, RMA assembled a task force of 12 people who have significant experience in the commercial lending and credit industry. These content experts identified seven competencies that every lender should possess, such as knowledge of financial accounting, cash-flow analysis, loan structuring and pricing, and detection of problem loans.
3. Call in the experts. Although RMA could supply the content expertise, we didn't have experience in developing a statistically reliable and valid tool. As a result, RMA contracted with Educational Testing Service (ETS), Princeton, New Jersey, an independent testing organization specializing in educational and measurement research.
4. Write the questions. Once the task team had identified the seven knowledge and skills areas, RMA recruited 15 senior-level bankers to write questions in each area. ETS conducted a workshop on writing questions before the bankers began their task and continued to provide editorial comments and suggestions throughout the process. For example, ETS recommended using multiple-choice questions with four answer options. That way, if an employee guessed the answer to every question, he or she could expect to have correct answers on 25 percent of the questions.
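That one-in-four expectation is easy to confirm with a quick simulation. The sketch below is purely illustrative (the question count and trial count are assumptions, not part of RMA's design):

```python
import random

NUM_QUESTIONS = 200  # size of the pilot assessment described in the article
NUM_OPTIONS = 4      # answer options per question, per the ETS recommendation
TRIALS = 2_000       # hypothetical number of simulated test sittings

def guess_score(num_questions: int, num_options: int) -> int:
    """Score for a participant who guesses uniformly at random."""
    correct = [random.randrange(num_options) for _ in range(num_questions)]
    guesses = [random.randrange(num_options) for _ in range(num_questions)]
    return sum(g == c for g, c in zip(guesses, correct))

avg = sum(guess_score(NUM_QUESTIONS, NUM_OPTIONS) for _ in range(TRIALS)) / TRIALS
print(f"Average score by pure guessing: {avg / NUM_QUESTIONS:.1%}")  # close to 25%
```

With four options, pure guessing hovers around 25 percent, which is why scores near that floor carry no evidence of competence.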
ETS also taught the question-writing team about cognitive level - the type of thinking an individual must do to answer a particular question. The three cognitive levels are knowledge (for example, recalling a fact), application (for example, using a mathematical formula), and higher-level thinking (analyzing a situation and determining the most appropriate action). Ideally, an assessment will have a mix of all three cognitive levels.
The task force reviewed the proposed questions for clarity and appropriateness, verified the content and cognitive classifications, and ensured that the questions were linked to the seven competencies. After individually commenting on a draft version, the task force reconvened for a final review and to agree on the correct answers.
5. Conduct pilot testing. The association task force's thorough review ensured the assessment had content validity. Still, we needed to determine the audience and define the conditions under which the assessment tool would be used. For the pilot testing, ETS recommended a sample of at least 500 people to provide enough data for conducting statistical tests.
More than 800 people from six geographically diverse banks participated in the pilot assessment. In several instances, managers took the assessment along with their employees (a practice we encourage to demonstrate a bank's commitment to the process). ETS scored the assessments and provided a statistical analysis of the results. Of the 200 questions asked, seven proved flawed - they were confusing or lacked focus. We discarded those questions.
RMA took both statistical results and participants' comments into consideration when developing the final instrument. For example, we set the assessment period at four hours and reduced the number of questions to 165; each competency has between 20 and 35 questions. We reserved the extra questions for later use.
Keep in mind that both the quantity and the quality of questions influence the reliability of the assessment. The question to ask is: If a person took the same assessment twice, with no interim training, would the results be essentially the same?
6. Set passing scores. One way you can prevent misuse of diagnostic assessment results is to use general labels such as passing, acceptable, or satisfactory rather than raw scores or percentages. The ETS project manager guided RMA's task force through a standard-setting study in which members identified how much knowledge a participant must demonstrate to show no training need. The task force also estimated the difficulty of each module. ETS then analyzed this information and identified a recommended passing point in each content area.
People who take the assessment, however, don't receive a numerical score. Instead, the results indicate levels of performance for each of the seven content areas. Acceptable indicates demonstrated competence, Training Need 1 points to a moderate need for training, and Training Need 2 indicates a strong need for training in that area.
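Reporting bands instead of raw scores amounts to mapping each content-area score through a pair of cut points. The sketch below illustrates the idea; the cutoff fractions are invented, since RMA's actual passing points came from the ETS standard-setting study:

```python
# Hypothetical cut points as fractions of questions answered correctly.
CUTOFFS = {"acceptable": 0.75, "training_need_1": 0.55}

def band(correct: int, total: int) -> str:
    """Map a content-area raw score to a reported performance band."""
    frac = correct / total
    if frac >= CUTOFFS["acceptable"]:
        return "Acceptable"
    if frac >= CUTOFFS["training_need_1"]:
        return "Training Need 1"
    return "Training Need 2"

# Illustrative results for two of the seven content areas.
results = {"cash-flow analysis": (26, 30), "loan structuring": (15, 30)}
for area, (correct, total) in results.items():
    print(f"{area}: {band(correct, total)}")
```

Because participants see only the band, a near-miss and a far-miss in the same band prompt the same response (training in that area) without inviting score comparisons between employees.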
7. Develop supporting materials. To accompany the question booklet and answer sheet, RMA developed a participant's guide and an administrative guide. The first describes the assessment, includes sample questions, and explains how results will be reported. It also calms any fears on the part of the participant, emphasizing that "The RMA Diagnostic Assessment is not a test, and you are not expected to study for it in advance."
The 56-page administrative guide covers similar topics in greater detail and also offers suggestions for the person administering the assessment. For example, we recommend greeting people at the door and include a script that administrators can choose to read word for word. (Much of the information in the guide seems elementary, but in doing the pilot testing, we discovered that administrators often forgot basic necessities - such as allowing a bathroom break during the four-hour assessment.)
8. Develop additional versions. If more than one version were available, RMA members could use the assessments as post-training devices to assess an employee's progress. That benefit prompted us to recruit 23 volunteers for a marathon question-writing session. In just one weekend, they wrote 500 new questions which, when added to the extra questions from the pilot assessment, became the basis for two additional versions.
Since RMA introduced its training diagnostic in September 1992, more than 12,000 people have participated in the assessment process. The member price is $100 per assessment; discounts apply for those who purchase larger quantities. The price includes the questions, answer sheet, participant's guide, and scoring and reporting of results by ETS.
ETS sends the results directly to the participants and to the bank's training office. RMA never sees the results, ensuring that no confidentiality issues can arise among our members.
Selling the assessments may lead to additional income in other association areas. For example, by using the diagnostic tool, bank managers have evidence of a specific training need. They can then justify sending their employees to an association-sponsored course or seminar. Although our members can choose any training provider, a number have returned to RMA to receive their professional development courses and materials.
In addition to the three versions available to U.S. banks, RMA recently finished an assessment for the Canadian market. Working with members from five Canadian banks, we started with the existing competencies and questions and then formulated a new assessment that reflects the country's lending laws and regulations.
Words of caution
If you decide that developing a training needs assessment would be beneficial for your members, be sure you properly position the product. A diagnostic assessment identifies training and development needs. However, it does not forecast or predict employee performance.
For example, RMA designed its assessment to help banks establish a baseline of training levels for their staffs. It wasn't designed to help banks make hiring, firing, or promotional decisions. (If intended for such personnel decisions, an assessment would require more development time, different questions, and measurements of performance and behavior - not to mention an exceedingly detailed statistical analysis.)
RMA specifically prohibits use of its diagnostic assessment for any purpose other than determining training needs. To use a diagnostic assessment for hiring, firing, promotion, retention, or job-placement decisions puts unwarranted stress on the assessment and could result in significant legal problems.
Whether you decide to purchase an assessment from an outside vendor or create it in-house, you'll provide a valuable service for your members. Although RMA's experience is with the heavily regulated banking industry, other industries and professions can certainly find value in focusing on critical needs and spending training dollars more wisely. If your members' employees acquire the skills to become better at their jobs, and if managers reinforce those skills and behaviors, your members will benefit financially.
RELATED ARTICLE: Finding the Flaw
Conducting a pilot test enables you to identify assessment questions that have the potential to confuse people. Here's an example of a flawed question:
A cash manager has a bill of $100,000 due in 45 days. She wishes to provide the funds for this bill by buying a security today that matures on the 45th day. If she can obtain a yield of 8 percent, how much should she invest today? (Assume a 365-day year and a simple interest formula.)
(A) $ 99,023 (B) $ 98,522 (C) $101,899 (D) $ 95,363
Problems: Answer C is not plausible, because you generally don't plan to lose money on an investment. Numerical options should be listed in descending order. Gender references should be left out of questions whenever possible.
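The correct answer can be checked directly with the simple-interest present-value formula, PV = FV / (1 + rate x days/365):

```python
# Working the sample question: how much to invest today, at 8 percent
# simple interest over 45 days, to cover a $100,000 bill at maturity.
future_value = 100_000
rate = 0.08
days = 45

pv = future_value / (1 + rate * days / 365)
print(f"Invest today: ${pv:,.0f}")  # $99,023, answer (A)
```

Since the investment must be less than the $100,000 it grows into, any option above $100,000 (such as answer C) can be eliminated without calculation, which is exactly why the question was flagged as flawed.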
Kenneth E. Shipley, Jr., is director of professional development for Robert Morris Associates, Philadelphia, and a member of the ASAE Education Section. Susan Thomas is measurement statistician for the Educational Testing Service, Princeton, New Jersey.
Date: February 1, 1995