
An internal evaluation of the National FFA Agricultural Mechanics Career Development Event through analysis of individual and team scores from 1996-2006.

Introduction/Theoretical Framework

FFA Career Development Events (CDE) serve as an opportunity for agricultural education students to apply their knowledge and skills across a variety of curriculum and career-related topics in a competitive setting, and they have been conducted as part of the National FFA Convention since 1947 (Smith & Kahler, 1987). By design, CDE are an outgrowth of classroom and laboratory instruction and of skills gained through supervised agricultural experience (SAE). Often, CDE are viewed as a motivational tool for student achievement and recognition (Croom, Moore, & Armbruster, 2009). Additionally, all CDE are competitive, and most involve a team activity (Talbert, Vaughn, & Croom, 2005), with events occurring at the local, state, and national levels. Recognition for individual and team achievement occurs in the form of plaques, pins, and scholarships. Students prepare for CDE more for leadership development, award recognition, and skills that will further their career choice than for the sake of competition (Croom et al., 2009).

The National Agricultural Mechanics Career Development Event (CDE) is one of 24 CDE conducted annually as part of the National FFA Convention. An organizing committee composed of university, secondary, and agriculture industry representatives is responsible for the development, conduct, and evaluation of activities aimed at measuring agriculture students' knowledge and technical skills in agriculture (Beard, 2001). The agricultural mechanics CDE is composed of five system areas, each represented by a knowledge and skill activity; in addition, there are a written exam and a team activity. The goal of the agricultural mechanics CDE is to assess students' agricultural mechanics competencies important to the modern workplace. The five system areas (along with the written exam and team activity) focus on a specific agricultural theme for a given year. The themes are announced each year and are published in the CDE manual.

The National FFA Agricultural Mechanics CDE is divided into seven sub-event activities: (a) written exam, (b) machinery and equipment systems, (c) industry and marketing systems, (d) energy systems, (e) structural systems, (f) environmental and natural resource systems, and (g) the team activity. Each individual completes the five skill and problem-solving areas worth 30 points each (150 points possible), a written exam (100 points possible), and the team activity (250 points possible). Each team member receives one-third of the team activity score. Each team may have four members, but only the scores from the top three members are used to determine the total team score (National FFA Organization, 2008).
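
To make the scoring arithmetic concrete, the following sketch (in Python, not part of the original study) computes an individual total and a team total under the structure just described; the function names and example values are hypothetical.

    # Hypothetical illustration of the scoring structure described above:
    # five system-area activities (30 points each), a written exam (100
    # points), and one-third of the team activity score per member.

    def individual_total(system_scores, exam_score, team_activity_score):
        """One contestant's overall score under the stated structure."""
        assert len(system_scores) == 5       # five system areas, 30 pts each
        return sum(system_scores) + exam_score + team_activity_score / 3

    def team_total(member_totals):
        """Only the top three of up to four members count toward the team."""
        return sum(sorted(member_totals, reverse=True)[:3])

    # Example with made-up numbers: five area scores, exam score, team score.
    member = individual_total([21, 16, 19, 18, 22], 65, 171)   # = 218.0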

Research on Career Development Events related to agricultural mechanics has examined student scores on a national-level agricultural mechanics CDE (Buriak, Harper, & Gliem, 1985) and the prediction of student achievement in state-level agricultural mechanics CDE from specific student characteristics (Franklin & Miller, 2005; Johnson, 1991, 1993).

Johnson (1991, 1993) identified several factors contributing to student achievement in a state-level agricultural mechanics CDE in Mississippi. A linear combination of average grade received in agriculture classes and farm residence and/or work experience was the best predictor of overall student achievement.

Franklin and Miller (2005) reported that grade level, years in agricultural education, highest math course completed, and achievement on the event written exam were the variables that best predicted student achievement in the agricultural mechanics CDE in Arizona.

The theoretical foundation of the study is the use of the ADDIE model to conduct an evaluation of instructional materials and training programs. ADDIE is an acronym for the five phases of the model: Analysis, Design, Development, Implementation, and Evaluation (Dick, Carey, & Carey, 2001; Peterson, 2003). The ADDIE framework is a cyclical process that continues over time, and each stage has a distinct purpose and function (Peterson, 2003). "This approach provides educators with useful, clearly defined stages for the effective implementation of instruction" (Peterson, 2003, p. 227).

[FIGURE 1 OMITTED]

Nearly three decades ago, Buriak et al. (1985) conducted an internal evaluation of the National FFA Agricultural Mechanics CDE using event scores and selected contestant demographic variables. The researchers published an evaluative study of contestants' scores in the National FFA Agricultural Mechanics CDE for the years 1979 to 1984. At that time, the CDE consisted of six problem-solving areas (10 points each), six skill areas (25 points each), and a written examination (90 questions worth 90 points, 15 questions from each of the six skill areas). The researchers sought to determine whether a regional bias existed in any sub-event area based on scores, and whether any sub-event activity contributed a disproportionate share of variance in the overall performance score of the contestant. The researchers concluded that a significant difference in scores existed based on FFA region. According to Buriak et al. (1985):
   This investigation demonstrates the utility
   of contest score evaluation and the need for
   further evaluation. Investigations of the
   prediction value of selected variables could
   prove useful in the development and
   enhancement of the contest. The use of trend
   analysis could explore the progress of
   contestants' scores in the various areas of a
   contest and may indicate areas needing
   particular attention. (p. 32)


A finding of their research was that Central FFA Region contestants scored significantly higher than contestants from other FFA regions. The written examination sub-event activity accounted for the largest proportion of the total event score variance. The researchers recommended continual evaluation of the event. The present study is an attempt to address the evaluation process of this CDE and to suggest an evaluation model for possible adaptation by other CDE.

Purpose & Objectives

The purpose of this study was to perform an internal evaluation of the National Agricultural Mechanics Career Development Event by analyzing performance scores for individual activities, team activities, and the total event from 1996 to 2006.

The evaluation objectives were to:

1. Identify the rankings of the top-ten states participating in the National Agricultural Mechanics CDE from 1996-2006;

2. Describe the contestant scores for the five system areas, written exam, and team activity by National FFA region for the period 1996 to 2006;

3. Determine if a significant difference exists among national regions based on any activity area or overall event scores; and

4. Determine if a linear combination of sub-event area scores could explain a significant portion of the variance associated with overall individual achievement in the National Agricultural Mechanics Career Development Event.

Research Methods & Procedures

Population and Sample

This study employed a descriptive-correlational research design. The population for this study included all contestants competing in the National Agricultural Mechanics CDE. The sample consisted of those students competing in the event from 1996 to 2006 (N = 1,735). Data per contestant included: year of competition, state, overall rank, team activity score, and individual final score (from 1996 to 2006). Data from the years 2000 to 2006 included the following: machinery and equipment system score, industry and marketing system score, energy system score, structure system score, environmental and natural resource system score, a summated written exam score (all five system area exam scores), team activity score, and overall individual contestant score.

Data Collection and Analysis

Data were converted directly from Microsoft Excel® spreadsheets into SPSS 16.0 for Windows® for analysis. Analyses for objectives one, two, and three were conducted using frequencies, percentages, means, and standard deviations. Pearson product-moment correlation coefficients were calculated, as appropriate, to meet objective four. An alpha level of .01 was established a priori to evaluate the statistical significance of all bivariate correlation coefficients. Based upon recommendations (Pallant, 2001), the .01 alpha level was also selected as the critical standard for the exploratory regression analysis.

The data were analyzed using a general linear model procedure (Buriak et al., 1985) for a simple analysis of variance (ANOVA) when data were compared by region. Alpha was set a priori at .01. Post hoc (Tukey HSD) analysis was conducted to determine which regions were significantly different.
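
The study ran these analyses in SPSS; purely for illustration, a minimal Python sketch of an equivalent region comparison appears below. The file name and column names (region, overall_score) are assumptions, not the study's actual variable names.

    # Sketch of the region comparison: one-way ANOVA with Tukey HSD post hoc.
    import pandas as pd
    from scipy import stats
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    df = pd.read_excel("agmech_cde_scores.xlsx")   # hypothetical data file

    # One-way ANOVA: overall individual score across the four FFA regions
    groups = [g["overall_score"].to_numpy() for _, g in df.groupby("region")]
    f_stat, p_value = stats.f_oneway(*groups)

    if p_value < .01:   # alpha set a priori at .01
        # Tukey HSD identifies which pairs of regions differ significantly
        print(pairwise_tukeyhsd(df["overall_score"], df["region"], alpha=.01))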

Pearson product-moment correlation coefficients were calculated to establish the strength and direction of the relationship between each sub-event skill area and overall individual event score. The procedure was accomplished for all contestants with accessible scores. Stepwise multiple regression techniques were used to perform the predictive calculations. The researchers assumed the scores represent all of the contestants and teams that participated in the national CDE during the years of interest; therefore, the findings are not to be generalized to any other population.
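
Continuing the hypothetical data frame from the sketch above, the bivariate correlations could be computed as follows; the sub-event column names are again assumptions for illustration.

    # Pearson product-moment correlations between each sub-event area score
    # and the overall individual event score. `df` is the hypothetical
    # DataFrame loaded in the earlier sketch; column names are assumed.
    from scipy.stats import pearsonr

    sub_events = ["exam_total", "mach_equip", "indus_mark", "energy",
                  "structures", "env_nat_res", "team_activity"]

    for area in sub_events:
        r, p = pearsonr(df[area], df["overall_score"])
        flag = "**" if p < .01 else ""       # .01 criterion set a priori
        print(f"{area}: r = {r:.2f} {flag}")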

Findings

Evaluation Objective 1

The first objective was to identify the top-ranking states competing in the National Agricultural Mechanics CDE for the years 1996 through 2006. For this analysis, only states finishing in the top-ten placings of each year were presented. The National FFA Organization (2008) groups each state into one of four national regions: Central Region (12 states), Eastern Region (18 states), Southern Region (12 states, including Puerto Rico), and Western Region (12 states). The state of Missouri (Central Region) ranked in the top-ten states in all years of analysis, garnering top honors in five of eleven years (1996, 1999, 2003, 2004, & 2006). Teams from Minnesota (Central Region) recorded nine top-ten rankings, twice as national champion (2000 & 2005). Teams from Montana (Western Region) placed nine times in the top-ten and claimed one national title (1998). Both Iowa (Central Region) and Texas (Western Region) had teams finish in the top-ten eight times each. Oregon (Western Region) and North Dakota (Central Region) each placed teams in the top-ten six times. Wisconsin (Central Region), Pennsylvania (Eastern Region), and Illinois (Eastern Region) each hold one national title and appeared in the top-ten at least four times; Wisconsin finished five times. Other states with four top-ten finishes include California (Western Region), North Carolina (Southern Region), South Dakota (Central Region), and Washington (Western Region). Kansas (Central Region) and Nebraska (Central Region) each have three top-ten placings, and Connecticut (Eastern Region) and Wyoming (Central Region) each have two. The states of Florida (Southern Region), Georgia (Southern Region), Maryland (Eastern Region), and Oklahoma (Central Region) all have one top-ten finish (Table 1).

A comparison of performance by region reveals that Central Region states placed more teams in the top ten over the period 1996-2006 than any other region (f = 61; 55.5%) and produced a national winning team more times than any other region (f = 8; 72.7%). The Eastern Region tied with the Western Region in the number of states finishing in the top ten (f = 21; 19.1% each). The Eastern Region (f = 2; 18.2%) led the Western Region (f = 1; 9.0%) in the number of national champions. The Southern Region ranks fourth among the national regions with seven top-ten finishers (f = 7; 6.3%) and no national champions (Table 2).

Evaluation Objective 2

The second objective was to describe contestant scores for the five system areas, the written exam, and the team activity by National FFA region for the period 1996 to 2006. The mean scores and standard deviations of contestant scores by CDE area and FFA region are presented in Table 3. The data show that mean scores for the Central Region and Western Region were numerically higher than those for the Eastern Region and Southern Region in all sub-event areas, with one exception: the Western Region was numerically higher than the Central Region in the structures systems sub-event area.

Evaluation Objective 3

The third objective of this study was to determine whether a significant difference exists among national regions. An analysis of variance (ANOVA) was conducted to determine whether the observed differences were statistically significant (Buriak et al., 1985). A one-way between-groups ANOVA was conducted to examine the impact of national region on all sub-event area scores and on overall individual event score. There was a statistically significant difference at the p < .01 level in overall individual score across the four regions (F(3, 1731) = 64.16, p < .01). The effect size, calculated using omega squared (Field, 2009), was .10. Post hoc comparisons using the Tukey HSD test revealed that the mean score for the Central Region (M = 212.67, SD = 38.70) was significantly different from the Eastern Region (M = 179.01, SD = 45.22), Southern Region (M = 181.94, SD = 41.42), and Western Region (M = 197.49, SD = 41.57). The Western Region was also significantly different from both the Eastern and Southern Regions (Table 4). Table 5 presents the F values for sub-event area scores and overall individual event scores by FFA region. All sub-event area scores were found to be significant.

All sub-event area scores had a significant (p < .01) positive correlation with overall individual event score. The Pearson correlation analysis revealed a significant, positive relationship between the dependent variable of overall individual event score and all of the independent variables: written exam total score (.79), machinery and equipment systems (.54), industry and marketing systems (.63), energy systems (.50), structural systems (.52), environmental and natural resource systems (.57), and team activity (.70). Table 6 presents correlation coefficients and effect sizes for the relationships between each sub-event score, including the individual written exam system area components, and overall individual event score.

As expected, the variable written exam total shows a strong positive correlation with overall individual event score. The written exam total is a composite score (100 points) of the five system areas (20 points each). Further analysis examined the individual components of the written exam. The seven independent variables were analyzed to determine if a model could be constructed that would explain a significant portion of the variance associated with overall individual event score. The first step in the process was to determine the inter-correlation between each pair of potential predictor variables (Ferguson, 1981). For a variable to serve as a good predictor in a regression model, it needs to possess two characteristics: a high correlation with the variable to be predicted and little or no correlation with other potential predictor variables (Pedhazur, 1997). These correlation coefficients are presented in Table 7.

Both the machinery and equipment management systems portion (r = .46) and the industry and marketing systems portion (r = .48) showed moderate positive correlations with the written exam total score, and neither revealed a strong correlation with other system portions of the written exam.

Evaluation Objective 4

The aim of the fourth evaluation objective was to explain the proportion of variance in overall individual event scores accounted for by the sub-event system area skill and exam portion scores.

The mean of the total overall individual event score was used as the dependent variable in this analysis. The written exam had previously accounted for the largest proportion of the total variance explained in overall individual event score (Buriak et al., 1985). Because the overall point value of the written exam is higher than the point values of the system area skills, the system area sections that make up the written exam were analyzed to determine how much variance in overall individual event score could be explained by scores on those portions of the written exam. Table 9 presents the means, standard deviations, and correlations for the five system area portions of the written exam and the overall individual event score.

Four of the five written exam system area portions achieved correlations greater than .60 with overall individual event score and showed little correlation with the other system areas. These four variables (machinery and equipment system score, industry and marketing system score, energy systems score, and structures system score) were entered into a stepwise multiple linear regression model (Table 8). The full model was significant (p = .001). The industry and marketing system score explained 40% of the variance in overall individual event score; the structures system exam score accounted for 14.5% of unique variance, the machinery and equipment system exam score for 7.8%, and the energy system exam score for 3.7%. An examination of the residuals showed that assumptions were not violated (lowest tolerance = .62; highest VIF = 1.61). Table 10 summarizes the regression of the dependent variable, overall individual event score, on each of the independent variables.
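
As a rough illustration of the stepwise procedure, the sketch below performs forward selection over the four exam-portion scores and then reports VIF values analogous to the collinearity screen above. Column names continue the assumptions of the earlier sketches; SPSS's stepwise algorithm can also remove entered variables that lose significance, which this simplified forward-only version omits.

    # Forward stepwise regression of overall individual event score on the
    # written-exam system-area scores, with a VIF collinearity check.
    # `df` is the hypothetical DataFrame from the earlier sketches.
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    y = df["overall_score"]
    candidates = ["indus_exam", "struct_exam", "mach_exam", "energy_exam"]
    selected = []

    while candidates:
        # At each step, add the candidate that yields the highest R-squared
        best = max(candidates, key=lambda c: sm.OLS(
            y, sm.add_constant(df[selected + [c]])).fit().rsquared)
        fit = sm.OLS(y, sm.add_constant(df[selected + [best]])).fit()
        if fit.pvalues[best] >= .01:         # entry criterion set a priori
            break
        selected.append(best)
        candidates.remove(best)

    X = sm.add_constant(df[selected])
    print(sm.OLS(y, X).fit().summary())      # full-model R-squared and betas
    for i, name in enumerate(selected, start=1):
        print(name, variance_inflation_factor(X.to_numpy(), i))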

Conclusions/Implications/Recommendations

Based on the objectives that guided this inquiry and the findings reported, the following conclusions were drawn. The first objective was to identify the rankings of the top-ten states participating in the National Agricultural Mechanics CDE from 1996-2006. Previously reported research on national agricultural mechanics CDE scores for the period 1979-1984 (Buriak et al., 1985) suggested that a regional bias existed based on a review of scores by national FFA region. Scores of contestants from the Central Region were found to be higher than scores of contestants from the three other national regions. The present study revealed that, on the surface, this trend continued: Central Region states, and the state of Missouri in particular, continued to perform at a high level in the national agricultural mechanics CDE. From 1996 to 1998, the national CDE was conducted in the state of Missouri. Did this provide a home field advantage for teams competing from Missouri? Further examination reveals that contestants from the state of Missouri continued to score well enough to remain in the top-ten teams nationally after the CDE moved from Kansas City, MO to Louisville, KY in 1999. Are there other variables that should be considered? Are contestants from Missouri coming from the same school, or prepared for national competition by the same coach or teacher? Does performance at the state-level CDE have an effect on national-level performance? Other states from the Central Region have had success in the CDE: Minnesota, Montana, and Iowa each placed in the top-ten several times during the period of study.

The mean total event score was 195 points (78%) out of 250 points. Mean total event scores for the Central and Western Regions were higher than those for the Southern and Eastern Regions. Central Region scores were higher in all sub-event system areas except structures systems, where the Western Region scored highest. Are states in the Western Region better prepared in skills related to the structures system area?

Does this suggest that teams from the Central and Western Regions are better prepared than teams from the Southern and Eastern Regions? Are the system area skills too narrow in scope for a national-level CDE? Conclusions from a previous agricultural mechanics CDE evaluation recommended that the (then) Power and Machinery Skills and Construction and Maintenance Skills areas be "modified to increase the variability of scores" (Buriak et al., 1985, p. 32), and reported that the written examination accounted for the largest proportion of the total variance explained in overall individual event score. In the present study, an attempt was made to dissect the written exam total score and analyze its five system area components to determine factors that contribute to success on the written exam portion of the overall individual event score.

The National Research Agenda of Agricultural Education and Communication (Osborne, 2007), RPA 5 (Determine the effects of agricultural education instruction), indicated the need to identify the professional development needs of agricultural educators.

This analysis was conducted with all contestant scores in the National Agricultural Mechanics Career Development Event for the period 1996-2006. Could additional data collected from states, such as contestant demographics (i.e., gender, age, and years of agricultural education), aid researchers in identifying variables that may contribute to student success? Similar research on student performance in agricultural mechanics CDE (Franklin & Miller, 2005; Johnson, 1991, 1993) identified level of high school math, years in agricultural education, and grade in agricultural education as potential variables contributing to student success in state-level CDE.

A question raised by this study, based on the record of performance of one state, is how state FFA associations certify teams to compete at the national level. Do team members come from the same school, or are they the top contestants from the state-level event?

Agricultural production equipment used for system area skill activities is provided by local producers or regional equipment dealers based on local supply and logistics. Does this present an advantage to teams from states residing within the same geographical region? These findings were generally consistent with the Buriak et al. (1985) study. Just as the national Agronomy CDE rotates among five cropping regions with specific plants (National FFA Organization, 2008), the agricultural mechanics CDE team activity could adopt a regional application while continuing to use the rotational theme of animal systems, plant systems, material handling systems, integrated pest management systems, and processing systems. A scenario describing a specific plant or crop and related equipment could be posted to the national agricultural mechanics CDE Website in mid-summer; teams would then have six months to research the posted information and prepare reports for presentation to a panel of judges on the first day of the event.

Research should be conducted at the FFA region level to explore the format and contents of state-level agricultural mechanics CDE. Do all states follow the format of the national agricultural mechanics CDE? Do state events follow the annual theme announced at the national level? Does a relationship exist between the number of times a contestant competes in agricultural mechanics CDE within a state and the level of success at the state level? One challenge of conducting a national-level CDE is developing activities that are relevant and appropriate to contestants in all states across the nation.

A model for conducting evaluations of CDE using ADDIE is proposed. Figure 3 shows the process of moving through each of the five stages. The Design stage occurs immediately following the CDE: the following year's theme is announced, potential system area skills are presented, and industry resources are identified. During the Development stage, both cognitive and psychomotor competencies are identified and developed by the organizing committee, and information posted to the national CDE Website is updated and finalized. During the Implementation stage, activities come together and the written exam is developed; communication among committee members, industry members, and National FFA staff becomes critical for planning and implementation. The CDE is conducted; scores are tabulated and analyzed; and awards and recognition are presented. The Evaluation stage occurs both during and after the CDE. During a meeting with coaches and teachers, suggestions for future skill activities are solicited by committee members, and a contestant evaluation survey is administered during the final round of the system area skill activity on the last day of the CDE. Finally, the organizing committee meets with members of the National FFA Organization prior to the end of the five-year revision process. Analysis is the final stage, in which industry members communicate with organizing committee members to identify relevant and rigorous competencies to update the CDE handbook.

[FIGURE 3 OMITTED]

Future analysis of national agricultural mechanics CDE scores should continue and follow the five-year national CDE revision process to determine whether recent changes made in the event format and system area scoring have any effect on student performance. The review should begin with scores obtained during the period 2007 to 2011. A study of the performance of states from the Central Region should be conducted to determine factors that have contributed to student success. Perhaps an ethnographic study of the coaches and advisors responsible for training successful teams would reveal effective training methods employed to prepare students for national competition. A question to ask is whether the CDE organizing committee should consider developing a regional rotation of themes, similar to the national Agronomy CDE. For the team activity, should a scenario be developed reflective of a regional agricultural mechanics situation for teams to research, address, and present at the national event? Additional research should address the percentage of teachers using the agricultural mechanics CDE Website hosted by the University of Missouri to locate updated information about forthcoming national competition.

Further analysis should be conducted to determine whether variables such as contestant age, gender, years of agricultural education, supervised agricultural experience (SAE), level of preparedness, experience of the instructor/coach, ability to follow directions, or group order in the event rotation have any effect on individual or team performance. These data can be gathered as part of the post-CDE evaluation administered to contestants during the final round of the system area skill activity.

Event superintendents of the 23 other national CDE should consider conducting a similar internal evaluation of individual and team scores to determine trends and patterns that may be evident in their own CDE.

10.5032/jae.2012.01095

References

Beard, R. (2001). Evaluating knowledge in agricultural mechanics. The Agricultural Education Magazine, 73(5), 16-17.

Buriak, P., Harper, J., & Gliem, J. A. (1985). Analysis of contestants' scores in the National FFA Agricultural Mechanics Contest 1979-1984. Journal of the American Association of Teacher Educators in Agriculture, 27(2), 27-33. doi: 10.5032/jaatea.1986.02027

Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum.

Croom, B., Moore, G., & Armbruster, J. (2009). An examination of student participation in national FFA career development events. Journal of Southern Agricultural Education Research, 59(1), 112-124.

Dick, W., Carey, L., & Carey, J. O. (2001). The systematic design of instruction (5th ed.). New York, NY: Longman.

Ferguson, G. A. (1981). Statistical analysis in psychology and education. New York, NY: McGraw-Hill.

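Field, A. (2009). Discovering statistics using SPSS (3rd ed.). London, England: Sage.
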
Franklin, E. A., & Miller, G. M. (2005). Student achievement and predictors of student achievement in a state level agricultural mechanics career development event. Proceedings of the 24th Annual Western Region American Association for Agricultural Education Research Conference, pp. 15-27.

Hinkle, D. E., Wiersma, W., & Jurs, S. G. (1979). Applied statistics for the behavioral sciences. Chicago, IL: Rand McNally College Publishing.

Johnson, D. (1991). Student achievement and factors related to achievement in a state FFA agricultural mechanics contest. Journal of Agricultural Education, 32(3), 23-28. doi: 10.5032/jae.1991.03023

Johnson, D. (1993). A three year study of student achievement and factors related to student achievement in a state FFA agricultural mechanics contest. Journal of Agricultural Education, 34(4), 39-45. doi: 10.5032/jae.1993.04039

National FFA Organization. (2008). National FFA career development events handbook. Retrieved from http://www.ffa.org/documents/cde_handbook.pdf

Newcomb, L. H., McCracken, J. D., Warmbrod, J. R., & Whittington, M. S. (2004). Methods of teaching agriculture. Upper Saddle River, NJ: Pearson-Prentice Hall.

Osborne, E. W. (Ed.). (2007). National research agenda: Agricultural education and communication, 2007-2010. Gainesville, FL: University of Florida, Department of Agricultural Education and Communication.

Pallant, J. (2001). SPSS survival guide. Philadelphia, PA: Open University Press.

Pedhazur, E. J. (1997). Multiple regression in behavioral research. Ft. Worth, TX: Harcourt Brace College Publishers.

Peterson, C. (2003). Bringing ADDIE to life: Instructional design at its best. Journal of Educational Multimedia and Hypermedia, 12(3), 227-241.

Rayfield, J., Fraze, S., Brashears, T., & Lawver, D. (2009). An assessment of the recruitment and training practices used in a national FFA career development event. Journal of Southern Agricultural Education Research, 59(1), 84-96.

Smith, M. W., & Kahler, A. A. (1987). Needed: Educational objectives and administrative criteria for national FFA contests. The Journal of the American Association of Teacher Educators in Agriculture, 28(2), 45-50. doi: 10.5032/jaatea.1987.02045

Talbert, B. A., Vaughn, R., & Croom, D. B. (2004). Foundations of agricultural education. Catlin, IL: Professional Educators Publications, Inc.

Edward A. Franklin, Associate Professor

University of Arizona

James Armbruster, Senior Relationship Manager

National FFA Organization

EDWARD A. FRANKLIN is an Associate Professor of Agricultural Education in the Department of Agricultural Education at the University of Arizona, 1110 E. South Campus Drive, 205 Saguaro Hall, Tucson, AZ 85721, eafrank@ag.arizona.edu.

JAMES ARMBRUSTER is Senior Relationship Manager with the National FFA Organization, 6060 FFA Drive, P.O. Box 68960, Indianapolis, IN 46268, jarmbruster@ffa.org.
Table 1
Rankings Of Top Ten States and Region Affiliation
In National Agricultural Mechanics CDE From 1996-2006

Rank     1996     1997     1998     1999

1st     MO (a)   PA (b)   MT (a)   MO (a)
2nd     TX (d)   MN (a)   MO (a)   IA (a)
3rd     MN (a)   MO (a)   IA (a)   ND (a)
4th     CT (b)   OR (d)   OK (a)   TX (d)
5th     OR (d)   IA (a)   NE (a)   KS (a)
6th     IA (a)   SD (a)   TX (d)   MN (a)
7th     IL (b)   WI (a)   WI (a)   MT (a)
8th     VA (b)   OH (b)   OH (b)   MD (b)
9th     MT (a)   WA (d)   NC (c)   AL (c)
10th    CA (d)   NE (a)   CA (d)   SD (a)

Rank     2000     2001     2002     2003

1st     MN (a)   WI (a)   IL (b)   MO (a)
2nd     WA (d)   MO (a)   TX (d)   MN (a)
3rd     TX (d)   IA (a)   IA (a)   WA (d)
4th     PA (b)   MT (a)   ND (a)   ND (a)
5th     WY (a)   TX (d)   WA (d)   IL (b)
6th     WI (a)   CA (d)   MT (a)   FL (c)
7th     MT (a)   OR (d)   OH (b)   PA (b)
8th     KS (a)   KS (a)   MN (a)   MT (a)
9th     OR (d)   NE (a)   SD (a)   OR (d)
10th    MO (a)   WY (a)   NC (c)   SD (a)

Rank     2004     2005     2006

1st     MO (a)   MN (a)   MO (a)
2nd     TX (d)   TX (d)   ND (a)
3rd     IA (a)   IL (b)   NC (c)
4th     MN (a)   MO (a)   CT (b)
5th     KS (a)   NY (b)   MT (a)
6th     WI (a)   ND (a)   PA (b)
7th     MT (a)   NC (c)   OH (b)
8th     IN (b)   CA (d)   IA (a)
9th     OH (b)   KY (b)   MN (a)
10th    ND (a)   PA (b)   GA (c)

Note: (a) Central Region; (b) Eastern Region;
(c) Southern Region; (d) Western Region.
Source: National FFA Organization, 2008.

Table 2
Comparison of Top-Ten State
Placings by National FFA Regions

             Champions      Top-Ten
                            Finishes

Region       f       %      f       %

Central      8     72.7    61    55.5
Eastern      2     18.2    21    19.1
Western      1      9.0    21    19.1
Southern     0      0.0     7     6.3
Total        11    100.0   110   100.0

Table 3
Means and Standard Deviations of Contestant
Scores by Sub-Event Area by FFA Region for 1996-2006

                     Central Region          Eastern Region

Sub Event Areas     M       SD      n       M       SD      n

Exam (a)          64.70    11.18   305    55.73    12.75   423
M&E (b)           21.03    6.44    305    17.24    7.17    423
I&M (c)           15.91    8.03    305    12.38    8.91    423
Energy (d)        19.44    8.17    305    17.29    8.30    423
Struct. (e)       18.38    5.75    305    15.20    6.04    423
E&NR (f)          21.51    7.20    305    17.84    7.77    423
Team (g)          56.88    13.18   305    46.09    18.33   423
Total             217.28   34.98   305    181.32   43.74   423

                     Southern Region         Western Region

Sub Event Areas     M       SD      n       M       SD      n

Exam (a)          56.20    12.24   196    61.36    12.65   227
M&E (b)           18.08    7.32    196    18.65    7.21    227
I&M (c)           12.34    8.24    196    13.11    8.62    227
Energy (d)        17.68    8.50    196    18.69    8.60    227
Struct. (e)       16.19    6.20    196    19.28    5.80    227
E&NR (f)          19.04    7.47    196    19.63    7.77    227
Team (g)          46.19    16.42   196    51.72    13.71   227
Total             185.26   41.55   196    201.93   38.00   227

                          Total

Sub Event Areas     M       SD      N

Exam (a)          59.30    12.82   1151
M&E (b)           18.66    7.17    1151
I&M (c)           13.46    8.63    1151
Energy (d)        18.20    8.40    1151
Struct. (e)       17.01    6.17    1151
E&NR (f)          19.37    7.70    1151
Team (g)          50.08    16.73   1151
Total             195.59   42.75   1151

Note. System area scores are out of 30 points;
written exam total is 100 points; team activity is
250 points total. (a) Written Exam Total;
(b) Machinery and Equipment System;
(c) Industry and Marketing Systems;
(d) Energy Systems; (e) Structures System;
(f) Environmental and Natural Resource System;
(g) Team Activity

Table 4
One-Way Analysis of Variance for Comparison of
Individual Overall Event Scores by FFA Region

Source            df         SS              MS

Between Groups     3     342678.58       114226.20
Within Groups    1731    3081858.72       1780.39
Total            1734    3424537.31

Source             F         p        [[omega].sup.2]

Between Groups   64.16      .000            .10
Within Groups
Total

Note. p < .01

Table 5
F-Values Generated by Analysis of Variance of
Individual Sub-Event Scores & Overall Individual Event
Scores by FFA Region

Activities                                        F-Value **

Written Exam Total                                  38.08
  Machinery and Equipment Systems Exam              22.08
  Industry and Marketing Systems Exams              16.89
  Energy Systems Exam                               18.45
  Structural Systems Exam                           27.62
  Environmental & Natural Resource Systems Exam     19.65
Machinery and Equipment Systems                     17.74
Industry and Marketing Systems                      11.94
Energy Systems                                       4.42
Structural Systems                                  30.84
Environmental & Natural Resource Systems            14.05
Individual Team Activity Score                      31.36
Overall Individual Event Score                      53.87
Note. ** p < .01

Table 6
Relationship Between Sub-Event Area Scores
And Overall Individual Event Score

Sub-Event Areas                               r       Effect Size (a)

Written Exam Total                          .79 **    High correlation
  Machinery and Equipment Exam              .61 **  Moderate correlation
  Industry and Marketing Exam               .63 **  Moderate correlation
  Energy Systems Exam                       .63 **  Moderate correlation
  Structural Systems Exam                   .62 **  Moderate correlation
  Environ. and Natural Resource Sys Exam    .49 **    Low correlation
Machinery and Equipment Systems             .54 **  Moderate correlation
Industry and Marketing Systems              .63 **  Moderate correlation
Energy Systems                              .50 **  Moderate correlation
Structural Systems                          .52 **  Moderate correlation
Environmental and Natural Resource Systems  .57 **  Moderate correlation
Team Activity (a)                           .70 **    High correlation

Note. (a) Team activity score is composite of three team members.
** p < .01

Table 7
Inter-Correlations Between Potential Predictor
Variables, Written Exam Total Score,
System Area Skills, and Team Activity

     Measure                             1        2        3

1.   Written Exam-Total                  -
2.   Mach and Equip Mgnt. Systems      .46 **     -
3.   Industry and Mark. Systems        .48 **   .24 **     -
4.   Energy Systems                    .24 **   .18 **   .27 **
5.   Structural Systems                .37 **   .27 **   .25 **
6.   Environ. and Nat. Res. Systems    .40 **   .35 **   .20 **
7.   Team Activity                     .38 **   .13 **   .30 **

     Measure                             4        5        6      7

1.   Written Exam-Total
2.   Mach and Equip Mgnt. Systems
3.   Industry and Mark. Systems
4.   Energy Systems                      -
5.   Structural Systems                .21 **     -
6.   Environ. and Nat. Res. Systems    .26 **   .17 **     -
7.   Team Activity                     .18 **   .25 **   .24 **   -

Note. ** p< .01

Table 8
Analysis of Variance for the Stepwise Multiple
Regression Analysis of Overall Individual Event Written
Exam System Area Scores

Source         df        SS          MS         F         p

Regression     5     1392546.89   278509.38   443.04   .000 (a)
Residual      1145   719782.32     628.63
Total         1150   2112329.21

Note. (a) Predictors: (Constant) E&NR exam score,
Indus exam score, Struct exam score, Mach exam score,
Energy exam score.

Table 9
Means, Standard Deviations, and Correlations
for Overall Individual Event Score and Written Exam
System Area Predictor Variables

Variable                M        SD       1        2

Overall Individual    192.07   44.44    .62 **   .63 **
Event Score
Predictor variable
  1. Mach Score       12.51     3.25      -
  2. Indus Score      13.22     2.98    .47 **     -
  3. Energy Score     11.34     3.26    .43 **   .49 **
  4. Struct Score     12.10     3.46    .39 **   .44 **
  5. E & NR Score     10.13     4.26    .45 **   .39 **

Variable                3        4        5

Overall Individual    .63 **   .62 **   .49 **
Event Score
Predictor variable
  1. Mach Score
  2. Indus Score
  3. Energy Score       -
  4. Struct Score     .53 **     -
  5. E & NR Score     .43 **   .43 **     -

Note: ** p<.01.

Table 10
Summary of Stepwise Regression Analysis for
System Area Variables Explaining Overall
Individual Event Score (N=1151)

Variable                    B     SE B   [beta]     t      p

Industry and Marketing    3.80    .31     .27     12.42   .00
  System Score
Machinery and Equipment   3.75    .29     .26     12.43   .00
  System Score
Energy System Score       3.19    .29     .26     13.91   .00
Structure System Score    3.25    .26     .28     11.08   .00
(Constant)                23.06

Note. Full Model: [R.sup.2] =.40; Adjusted
[R.sup.2] =.40; F = 551.75; p =.001