The PSI Scorecard: Determining When to Close a CTE Program
HOW DO YOU KNOW WHEN IT IS TIME TO EITHER GIVE A PROGRAM EXTRA ATTENTION OR PERHAPS CLOSE IT? That question faces many career and technical education (CTE) programs, and there is no one-size-fits-all answer. Traditional assessments do little to help programs leverage their strengths and compensate for their weaknesses. In St. Louis County, Missouri, a team of CTE advisers and administrators developed a model for reviewing programs facing this challenge.
The St. Louis County Special School District (SSD) is an overlay district that provides both special education and CTE to its 23 school districts. CTE services are provided through SSD's two technical high schools: North Technical High (1,200 students) and South Technical High (850 students). The district offers 34 career and technical programs with 25 being duplicated at both schools.
SSD's superintendent, John Caw, and his CTE advisory board wanted to use data to evaluate the district's CTE programs. A subcommittee was formed to look into program initiation, evaluation and termination. The superintendent and the board members were looking for a formal program that consisted of a written policy, and subsequent procedures, to evaluate all of the technical education programs.
Doing the Groundwork
First, the subcommittee investigated whether other career centers and/or career districts had a formal system of evaluation for career and technical programs. After contacting Missouri career centers and finding no formal written processes, calls went out to other states' CTE centers. Sandra Royer, the health and consumer sciences supervisor at Miami Valley Career Technology Center in Clayton, Ohio, provided a draft called Disinvestment Guidelines. The guidelines called for programs that fell below a certain level of enrollment to be placed on a watch list. Programs that continued to fall below that level, after analysis and interventions, could be placed on probation. Though the guidelines were only a draft, Royer indicated they were a good place to start a conversation about evaluating programs.
"The primary purpose of any systematic assessment of school performance ... is to reveal best practices and identify shared problems in order to encourage teachers and schools to develop more supportive and productive learning environments," suggest Schleicher and Stewart (2008, p. 49). Accordingly, the subcommittee liked the idea of a tiered system for evaluating the programs. A program would be placed on a watch list, and if things did not improve in the second year, it would be placed on probation. In subsequent years, the program could be terminated if progress was not made. A duplicated program, such as carpentry, would be evaluated using separate data for each school.
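As a sketch only, the tiered progression can be modeled as a simple status transition. The article does not spell out how a program exits probation, so the reset-to-good-standing rule below is an assumption, and the function and tier names are illustrative:

```python
# Sketch of the watch -> probation -> termination progression.
# The rule that a program returns to good standing as soon as it meets
# the cutoff is an assumption, not part of the published model.
TIERS = ["good standing", "watch", "probation", "termination"]

def next_status(current, below_cutoff):
    """Advance a program one tier if it is below the cutoff, else reset."""
    if not below_cutoff:
        return "good standing"
    i = TIERS.index(current)
    return TIERS[min(i + 1, len(TIERS) - 1)]
```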
Using the PSI Scorecard
The next step was to find valid and measurable criteria to evaluate the programs. Koretz (2008, p. 19) posits, "A sensible accountability system ... holds people accountable for what they can control." To this end, the subcommittee members started by listing all of the factors they, as stakeholders, thought should be assessed in order to operate a successful CTE program. After much discussion, the subcommittee combined many of the factors and came to consensus on measurable criteria which relate to federal and state accountability standards.
The criteria used to evaluate the programs are called Program Status Indicators (PSI). The subcommittee settled on five major PSI which provide sufficient objective data to analyze a program's success. They are: placement, enrollment, advisory committees, certification, and occupational outlook. What follows are the definitions of the PSI as well as cutoff scores identified for each indicator.
Placement: A combination of job and postsecondary placement is reviewed. Positive Missouri School Improvement Program (MSIP) placements include students:
1. working in their field of study or a related field;
2. continuing postsecondary study; or
3. entering the military.

The data is taken from the previous year's 180-day follow-up surveys.
Advisory Committees: Following the newly approved Missouri Workforce Investment Board's guidelines pertaining to the makeup of the body, 51 percent of advisory members must be business/industry representatives. Business representatives are those in a business with supervisory/decision-making roles. For the 2009-2010 school year there must be at least eight advisers, with five advisers present at an advisory meeting; in the 2010-2011 school year there must be 10 advisers, with six present at an advisory meeting; in the 2011-2012 school year there must be 12 advisers, with seven members present. In order to have a quorum, a majority of the present advisers must be business/industry representatives.
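These escalating requirements can be expressed as a simple lookup. This is a sketch under the guidelines as described above; the function and variable names are illustrative, not part of the board's policy:

```python
# Adviser requirements by school year: (minimum total advisers,
# minimum advisers present at a meeting), per the Missouri Workforce
# Investment Board guidelines described above.
ADVISER_REQUIREMENTS = {
    "2009-2010": (8, 5),
    "2010-2011": (10, 6),
    "2011-2012": (12, 7),
}

def meeting_qualifies(year, total_advisers, present, business_present):
    """Check the adviser counts and the business/industry quorum rule."""
    min_total, min_present = ADVISER_REQUIREMENTS[year]
    quorum = business_present * 2 > present  # majority of those present
    return total_advisers >= min_total and present >= min_present and quorum
```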
Enrollment: Fifty percent of student enrollment capacity needs to be maintained. The total capacity of a program includes both juniors and seniors. If capacity for the junior class is 20 students and the capacity for the senior class is 20 students, the total capacity is 40 students. Programs need 50 percent enrollment to qualify for Missouri Department of Elementary and Secondary Education reimbursement.
Certification: All programs will be required to have 100 percent of their students take a Technical Skill Assessment (TSA) by the spring of 2012 in order to fulfill the Perkins federal legislation. The U.S. Department of Education has set the pass rate at 62.5 percent for secondary CTE students. SSD's technical high schools currently administer the WorkKeys test to all incoming juniors and exiting seniors. The WorkKeys Career Readiness Certificates will be used to measure student success for all programs that do not have an established TSA in place.
Occupational Outlook: Either the U.S. Department of Labor's Occupational Outlook Handbook or the Missouri Economic Research and Information Center (MERIC) is consulted to classify occupational projections. MERIC was selected as the primary source since career projections are provided in a grade format: Grade "A"=Excellent Outlook to "F"=Poor Outlook. (There are five grades/levels.)
In addition to establishing a baseline for each indicator, the subcommittee wanted to develop a rubric where all of the PSI could be weighted in order to come up with a composite score. In this way, no single PSI would cause a program to be terminated and all five indicators would be considered when evaluating a program's status. After much discussion and consensus, the following points and percentages were assigned to each PSI:
As a sample exercise using this system, data from the 2007-2008 school year was examined to determine where programs would fall within the rubric (see Figure 1). The data chart shows how the PSI are scored for each program. If a program had a placement rate of 100 percent, it would earn all 30 points; a placement rate of 50 percent would earn 15 points. The same principle of proportional scoring applies to the other categories: if enrollment is 50 percent of capacity, the program earns 13 of the 25 points (rounded up). For the sake of establishing a total composite cutoff score for the rubric, it is assumed that all programs hold two advisory committee meetings each year with the appropriate number of advisers; a program with only one meeting that met the established criteria would receive just 10 of the 20 available points. The subcommittee set the certification indicator at 70 percent (14 points) solely for the sake of establishing a composite rubric score. The outlook indicator ranges from 5 to 1 based on the A, B, C, D, F grade categories established by MERIC.
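The proportional scoring just described can be sketched in a few lines. The round-up behavior is inferred from the worked example above (50 percent enrollment earning 13 of 25 points), and the function name and parameters are assumptions for illustration:

```python
import math

# Composite PSI score, weighted per Figure 1 (30/25/20/20/5 points).
# Proportional indicators are rounded up, matching the worked example.
OUTLOOK_POINTS = {"A": 5, "B": 4, "C": 3, "D": 2, "F": 1}

def psi_composite(placement_rate, enrollment_rate, qualifying_meetings,
                  certification_rate, outlook_grade):
    """Return a program's composite PSI score out of 100 points."""
    placement = math.ceil(placement_rate * 30)         # 30-point indicator
    enrollment = math.ceil(enrollment_rate * 25)       # 25-point indicator
    advisory = min(qualifying_meetings, 2) * 10        # 10 points per meeting
    certification = math.ceil(certification_rate * 20) # 20-point indicator
    outlook = OUTLOOK_POINTS[outlook_grade]            # 5 points max
    return placement + enrollment + advisory + certification + outlook
```

Under these assumptions, a program with 50 percent placement, 50 percent enrollment, two qualifying meetings, a 70 percent certification rate and a "C" outlook scores 15 + 13 + 20 + 14 + 3 = 65 points, which would fall below a 70-point cutoff.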
After all of the data from each program was put into the sample rubric, 11 of the 51 programs evaluated (about 20 percent) fell below the composite cutoff score of 70 percent. The committee judged that this was a good place to start; the cutoff score can always be changed by the CTE advisory board as the PSI change. All programs that fall below the cutoff score of 70 will be placed on the official watch list. Any program on the list is required to write an improvement plan tailored to the identified problem areas. The plan will be written with input from teachers, administrators and advisory team members. All plans will be submitted to the building administrators for approval and forwarded to the superintendent's CTE advisory board.
Programs that are not on the watch list but have one or more PSI which fall below acceptable levels will be required to have an improvement plan submitted to the building administrators. The less than acceptable levels are as follows:
Placement: 60 percent
Enrollment: 50 percent
Advisory Meetings: 100 percent
Certification: 62.5 percent
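A minimal sketch of this per-indicator check, assuming rates are expressed as fractions; the dictionary keys and function name are illustrative:

```python
# Per-indicator acceptable levels listed above, expressed as fractions.
ACCEPTABLE_LEVELS = {
    "placement": 0.60,
    "enrollment": 0.50,
    "advisory_meetings": 1.00,
    "certification": 0.625,
}

def indicators_needing_plan(program):
    """Return the indicators that fall below their acceptable levels."""
    return [name for name, floor in ACCEPTABLE_LEVELS.items()
            if program.get(name, 0.0) < floor]
```

A program flagged by this check would need an improvement plan even if its composite score cleared the 70-point cutoff.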
Program Termination Is a Three-tiered Process: Watch, Probation, Termination
Each year the CTE advisory board will examine the improvement plans of the programs on the watch and probation lists. Programs that continue to perform below acceptable levels may be recommended for termination by the CTE advisory board to the district's board of education. SSD will implement the PSI model in the upcoming 2009-2010 school year. It is SSD's intention to report back to CTE professionals in three years and provide results of this endeavor. Hopefully, students, educators and businesses will benefit from SSD's attempt to utilize data to improve outcomes based on an array of mutual interests.
The subcommittee members were: John Gaal, director of training and workforce development for the Carpenters' District Council of Greater St. Louis; Jeff Spiegel, superintendent of the Ferguson-Florissant School District in St. Louis County; George Lestina, owner and president of IPC Graphics Printing and Design; Mike Powers, principal of North Technical High School; Kim Ford-Beals, assistant principal of North Technical High School; Randy Barnes, facilitator of technology for the SSD Technical Education Division; Jim Fischer, retired SSD technical education administrator; and Shane Trafton, administrator of curriculum and instruction for the SSD Technical Education Division.
Koretz, D. (2008, Fall). "A Measured Approach." American Educator, 32, 18-27 & 39.
Schleicher, A. & Stewart, V. (2008, October). "Learning from World-class Schools." Educational Leadership, 66, 44-51.
Sternberg, R. (2007, December-2008, January). "Assessing What Matters." Educational Leadership, 65, 20-26.
John Gaal is director of training and workforce development for the Carpenters' District Council of Greater St. Louis. He can be contacted at email@example.com.
Shane Trafton is administrator of curriculum and instruction for the SSD Technical Education Division in St. Louis County. He can be contacted at firstname.lastname@example.org.
Figure 1
Placement: 30 percent (30 points)
Enrollment: 25 percent (25 points)
Advisory Committee: 20 percent (20 points)
Certification: 20 percent (20 points)
Occupational Outlook: 5 percent (5 points)