
A confirmatory factor analysis of the School Administrator Efficacy Scale (SAES).


Efficacy refers to people's beliefs about, and confidence in, their capability to execute actions to attain a specific goal. Although knowledge of efficacy is well developed regarding students' learning and teachers' success, there is almost no research on the efficacy of school administrators. However, McCollum, Kajs, and Minter (2006) developed a model and measure of school administrators' efficacy. The scale was based on the Educational Leadership Constituent Council (ELCC) national standards. Through exploratory factor analysis, eight dimensions of school administrator efficacy were derived: (1) Instructional Leadership and Staff Development, (2) School Climate Development, (3) Community Collaboration, (4) Data-based Decision Making Aligned with Legal and Ethical Principles, (5) Resource and Facility Management, (6) Use of Community Resources, (7) Communication in a Diverse Environment, and (8) Development of School Vision. This scale is titled the School Administrator Efficacy Scale (SAES). In the present study, a model of the SAES was tested using confirmatory factor analysis, hypothesizing the eight dimensions of the scale posited by its original researchers. The hypothesized model was tested using a sample of 559 principals and principal trainees. This research advances the knowledge and measurement of school administrators' efficacy, as additional evidence of the scale's validity and reliability is provided.


Consider two assistant principals early in their careers, Carl and Sandra. Both have graduated with master's degrees from respectable institutions. From their course work, one could assume that both have the necessary knowledge, skills, and dispositions to perform their new jobs. Same skill set, same level of experience, yet in a short time, Sandra is outperforming her peer, Carl. Sandra works better with the community, is more effective in handling diversity, more adept in addressing instructional issues, and overall makes better decisions in the administration of her school. What sets these two assistant principals apart? Carl often finds himself thinking that he cannot get the job done, while Sandra stands tall, confident in the knowledge, skills, and dispositions she has gained. In terms of Social-Cognitive Theory (Bandura, 1986), Sandra is self-efficacious, while Carl lacks self-efficacy. Efficacy refers to "people's judgments of their capabilities to organize and execute courses of action required to attain designated types of performances" (Bandura, p. 391). Efficacy can impact one's performance on a variety of tasks, as well as one's motivation for self-improvement to better engage in effective school practices (Schultz & Schultz, 1998). These elements are especially important for carrying out responsibilities of instructional leadership, since "Principal quality is linked statistically and practically to student achievement" (Kaplan, Owings, & Nunnery, 2005, p. 43).

The present article briefly summarizes self-efficacy as studied in students and teachers. From there, the pertinent, though scarce, literature on school administrators' (e.g., principals') efficacy is reviewed. Then, a recently developed model of a school administrator efficacy scale is posited. The model is theoretically based on Bandura's (1986) Social-Cognitive Theory, and its associated measure stems from the Educational Leadership Constituent Council's (ELCC's) national standards. The theoretical model has been empirically tested by McCollum, Kajs, and Minter (2006) and is further tested, along with its measure--the School Administrator Efficacy Scale (SAES)--in the present article.


Research supports the claim that self-efficacy is a key construct in education. Self-efficacy is related to persistence, effort, and success on tasks (Bandura, 1986). Covington (1984) asserts that one's successes can lead to greater efforts to accomplish and persevere through difficult assignments. Thus, achievement on tasks builds self-efficacy, which in turn leads to greater success. In addition, self-efficacy can be increased through vicarious experiences (Bandura). That is, seeing the successful actions of others in their field can strengthen persons' beliefs that they too can master similar tasks (Bandura). Therefore, observing their mentors' effective performances can build up protégés' self-efficacy.

The importance of the efficacy construct has been made apparent as it relates to students and, increasingly, as it applies to teachers (e.g., Tschannen-Moran, Woolfolk-Hoy & Hoy, 1998). Henson, Kogan, and Vacha-Haase (2001) concluded that several positive behaviors of teachers (e.g., the ability to deal more effectively with failing students, persistence when difficulties arise) are linked to teacher efficacy, and these behaviors positively impact student outcomes. Self-efficacy research, however, has mostly surrounded teachers and students, with little knowledge of the construct's use to understand the actions of school administrators (e.g., principals). Given the benefits of being efficacious (e.g., greater effort, persistence, and success), this construct holds considerable promise for the study of administrators who work in schools.


More study is needed of the implications of efficacy for school administrators. For instance, do efficacious principals more effectively handle job stress, as well as relationships with staff, students, and parents? Do school administrators who demonstrate high efficacy employ more effective management practices, e.g., organizational planning, problem solving, and community building? These issues are critical in the face of the changing roles of school administrators and recent changes in their preparation.

Compared to traditional approaches to school administrator preparation, five major shifts seem to have occurred in the current preparation of school administrators, including an emphasis on (a) interpersonal skills, (b) consensus development, (c) accountability processes, (d) integration of community and school needs and resources, and (e) policy development (NPBEA, 2002-a). It is understandable that communication has become a major focus, since principals can spend up to 80 percent of their time communicating with students, campus staff, parents, and the larger community (Green, 2001). The necessity of having school administrators who can demonstrate effective communication and social skills to address conflict resolution and consensus building in the school and larger community is supported by field research on principals (Kajs, Decman, Cox, Willman, & Alaniz, 2002). Because of the added attention to state and national accountability mandates, e.g., No Child Left Behind (NCLB), the shift to have principals more thoroughly prepared in accountability processes is also understandable.

Research is needed to determine efficacy levels in potential school administrators as well as in current ones. Two studies on principal efficacy include Dimmock and Hattie (1996), who found efficacy to be a valued element for principals in a school restructuring process, and Smith, Guarino, Strom, and Adams (2006), who concluded that the quality of teaching and learning is influenced by principal efficacy. Principal efficacy measures that have been developed include the (1) Principal Self-Efficacy Survey (PSES) (Smith et al.); (2) Principals Sense of Efficacy Scale (PSES) (Tschannen-Moran & Gareis, 2004); (3) principal efficacy vignettes (Dimmock & Hattie); and (4) School Administrator Efficacy Scale (SAES) (McCollum, Kajs, & Minter, 2006). Smith et al. measured efficacy in Instructional Leadership and efficacy in Management. Those authors offered construct validity evidence in the form of factor analysis. Though their measure may be promising in terms of validity, it captures only two dimensions of the principal's job. More thorough in their investigation were Tschannen-Moran and Gareis (2004), who tested multiple measures of principals' efficacy. Those authors concluded that Dimmock and Hattie's (1996) measure was neither valid nor reliable and, therefore, could not be used in further studies or in practice. However, Tschannen-Moran and Gareis found some factorial validity and reliability for the scale they created--the Principal Sense of Efficacy Scale (PSES). Those authors captured three dimensions of the principal's job (i.e., management, instructional leadership, and moral leadership). Though their instrument is promising in terms of its psychometric properties and expands beyond the work of Smith et al., there may be potential to further capture the administrator efficacy construct by identifying additional dimensions of the job. McCollum, Kajs, and Minter developed a scale to measure school administrators' (e.g., principals') efficacy.
Those authors noted the construct (factorial) validity of the scale and high reliability coefficients for its subscales. Through exploratory factor analysis, eight dimensions of school administrator efficacy were derived. These dimensions included (1) Instructional Leadership and Staff Development, (2) School Climate Development, (3) Community Collaboration, (4) Data-based Decision Making Aligned with Legal and Ethical Principles, (5) Resource and Facility Management, (6) Use of Community Resources, (7) Communication in a Diverse Environment, and (8) Development of School Vision (McCollum, Kajs, & Minter).

The scale created by McCollum, Kajs, and Minter (2006) was based on the ELCC national standards. These standards incorporate the well-known Interstate School Leaders Licensure Consortium (ISLLC) standards (Murphy, 2005). ELCC's leadership framework provides a roadmap for university-based educational administrator preparation programs regarding specific knowledge, skills, and dispositions related to key themes in the development of school principals and superintendents (NPBEA, 2002-a). The work of Kaplan, Owings, and Nunnery (2005) noted the link between principal quality and the ISLLC (ELCC) standards. Results indicated that principals who demonstrated higher ratings on these standards led schools with higher student achievement than did school administrators who scored lower (Kaplan, Owings, & Nunnery, 2005). Their study noted that competent teachers want to work with effective principals, not ineffective ones; thus, the quality of leadership can directly impact teacher retention levels (Kaplan, Owings, & Nunnery).

The current ELCC Standards consist of seven standards for the preparation of school administrators. Standards one through six (1-6) address the chief components of school administration (e.g., community communications and collaboration), while standard seven (7) focuses on applying and synthesizing the content, skills, and dispositions outlined in standards one through six through an internship experience. The seven ELCC Standards are outlined in Table 1 (NPBEA, 2002-b, pp. 1-18):

The SAES and its eight dimensions provide the model and measure for the present study. McCollum, Kajs, and Minter (2006) note that this scale can serve as a useful tool in the development of future and current school leaders, since its subscales address the knowledge, skills, and dispositions incorporated in the ELCC Standards, and since there are few studies related to principals' efficacy and its measurement.


Given that efficacy is an understudied construct in the domain of school administration, a need exists to develop high quality instruments to measure the construct, as well as to study the construct further using such instrumentation. Establishing the validity of a measurement instrument is a key process in the development of good instrumentation. Benson (1998) offers three stages of construct validation: (1) substantive, (2) structural, and (3) external. In the substantive stage, constructs are theorized and defined. In the structural stage, relationships among variables purported to measure the construct are sought. Such techniques as exploratory and confirmatory factor analysis and internal consistency measures (e.g., Cronbach's Alpha) are utilized. The external stage incorporates the construct's relation to other constructs (i.e., creating the nomological network, see Cronbach & Meehl, 1955). This study focuses on advancing the structural stage of Benson's plan for developing the construct validity of the SAES. Previous research (i.e., McCollum, Kajs, & Minter, 2006) has addressed the substantive stage and has only begun to address the structural stage of construct validation.


Previous research has established initial evidence of the reliability and construct validity of the SAES (i.e., McCollum, Kajs, & Minter, 2006). Still, further evidence is needed to support the scale's construct validity; in previous work only exploratory factor analytic techniques were used. This study tests the SAES model using a confirmatory factor analysis and hypothesizing the eight dimensions of the scale posited by its original researchers. This study is being conducted to determine the construct validity of the SAES, and to lead to improvements in the measurement of school administrators' efficacy. In addition, the internal consistency of the scale will be re-evaluated in this new sample, using Cronbach's Alpha. This study serves to advance the knowledge and measurement of school administrator efficacy.



The study participants were early career principals and principal trainees (n = 559). The participants were teaching in school districts or carrying out principal functions in the Houston, Texas area. The participants' mean teaching experience was 7.8 years (SD = 5.72). The mean experience as a principal was 7.8 months (SD = .41). The participants' mean age was 34.8 years (SD = 7.77). The sample consisted of 79.4 percent females and 20.6 percent males. The study sample was 51.2 percent European-American, 22.9 percent African-American, 22.5 percent Hispanic, 1.3 percent Asian, 1.1 percent biracial people, .4 percent Latino, .2 percent Native American, and .4 percent other.


The SAES uses 51 items (see Appendix for items) and is purported to measure eight dimensions of school administrators' efficacy, using a seven-point Likert-type scale (1 = not at all true of me, 7 = completely true of me). The eight dimensions and their reliability coefficients (Cronbach's Alpha) based on McCollum, Kajs, and Minter (2006) are (1) Instructional Leadership and Staff Development (.93), (2) School Climate Development (.93), (3) Community Collaboration (.91), (4) Data-based Decision Making Aligned with Legal and Ethical Principles (.93), (5) Resource and Facility Management (.89), (6) Use of Community Resources (.95), (7) Communication in a Diverse Environment (.81), and (8) Development of School Vision (.86). These dimensions were derived through exploratory factor analysis; hence, some initial evidence of construct validity exists. The content validity of the SAES comes from its basis in the Educational Leadership Constituent Council (ELCC) national standards. The scale is young, but given its initial validity and reliability evidence, there is promise that the research conducted in the current study will provide clearer construct validity evidence, thereby furthering the measurement of school administrator efficacy.
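For readers unfamiliar with the reliability coefficients reported above, Cronbach's Alpha can be computed directly from item-level responses. The sketch below uses hypothetical 7-point Likert scores (not data from this study) and the standard formula relating item variances to total-score variance.

```python
# Sketch of Cronbach's Alpha, the internal-consistency statistic
# reported for each SAES subscale. Scores below are hypothetical
# 7-point Likert responses, not data from the study.
from statistics import variance

def cronbach_alpha(items):
    """items: one inner list per item, each covering the same respondents."""
    k = len(items)
    respondents = list(zip(*items))              # rows become respondents
    total_scores = [sum(row) for row in respondents]
    item_var_sum = sum(variance(col) for col in items)
    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    return (k / (k - 1)) * (1 - item_var_sum / variance(total_scores))

# Three hypothetical items answered by five respondents; scores that
# move together across respondents yield a high alpha.
items = [
    [7, 6, 5, 3, 2],
    [6, 6, 4, 3, 1],
    [7, 5, 5, 2, 2],
]
alpha = cronbach_alpha(items)
print(round(alpha, 2))  # → 0.98
```

Because the three hypothetical items rank respondents almost identically, alpha approaches 1; real subscales with more measurement error land in ranges like the .81-.95 reported for the SAES.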


The SAES was given in group administrations to the 559 principals and principal trainees in the sample. Participants filled out a consent form, acknowledging their participation, and were provided a set of instructions for completing the SAES and a copy of the instrument. The SAES took approximately 20 minutes to finish.


The hypothesized model for the school administrator efficacy dimensions (see Appendix for a listing of items by dimension) was tested using confirmatory factor analysis in EQS 6.1. The normalized estimate of multivariate kurtosis was high (180.71), so robust maximum likelihood estimation was used in parameter estimation. According to Hu, Bentler, and Kano (1992), in cases of high multivariate kurtosis, typical of item data, a robust estimation method is desirable for corrective purposes. Upon completion of parameter estimation, the Bentler-Bonett nonnormed fit index (NNFI), robust comparative fit index (RCFI--a robust calculation of the CFI), standardized root mean squared residual (SRMR), and root mean squared error of approximation (RMSEA) were selected among fit indices to evaluate the model's goodness of fit. These indices have performed well in simulation studies and yield complementary information (i.e., Hu & Bentler, 1998, 1999; Hutchinson & Olmos, 1998; Marsh, Balla & Hau, 1996). Furthermore, it is a composite of these four criteria that is typically used when evaluating model fit and testing a hypothesis using confirmatory factor analysis.
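Two of the fit indices named above (NNFI and CFI; RMSEA as well) can be derived from the model and baseline chi-square statistics. The sketch below applies the standard textbook formulas to made-up chi-square values for a sample of 559; it does not reproduce the study's EQS output, and the robust (RCFI) variant would additionally require a scaled chi-square.

```python
# Standard formulas for NNFI (TLI), CFI, and RMSEA computed from the
# model chi-square, the independence (null) model chi-square, their
# degrees of freedom, and the sample size. Inputs are hypothetical.
from math import sqrt

def fit_indices(chi2_m, df_m, chi2_null, df_null, n):
    ratio_null = chi2_null / df_null
    nnfi = (ratio_null - chi2_m / df_m) / (ratio_null - 1)
    cfi = 1 - max(chi2_m - df_m, 0) / max(chi2_null - df_null, chi2_m - df_m, 0)
    rmsea = sqrt(max(chi2_m - df_m, 0) / (df_m * (n - 1)))
    return nnfi, cfi, rmsea

# Hypothetical values illustrating a well-fitting model on n = 559.
nnfi, cfi, rmsea = fit_indices(chi2_m=2400, df_m=1196,
                               chi2_null=25000, df_null=1275, n=559)
print(round(nnfi, 2), round(cfi, 2), round(rmsea, 3))  # → 0.95 0.95 0.042
```

These formulas make the cutoffs discussed below concrete: NNFI and CFI rise toward 1 as the model chi-square approaches its degrees of freedom, while RMSEA falls toward 0.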

The NNFI of .90 and RCFI of .90 met the .90 standard for acceptable fit given by Bentler (1992). The SRMR of .06 was in a good range for fit, below the .08 cutoff for acceptable model fit (Hu & Bentler, 1998, 1999). Likewise, the RMSEA of .05 was in a good range for fit, given the standard of less than or equal to .08 (Browne & Cudeck, 1993). All four criteria--NNFI, RCFI, SRMR, and RMSEA--suggest that the model fits. Therefore, the evidence in support of the model is strong. The path coefficients, error variances, variances accounted for (R2), item means, and item standard deviations are presented in Table 2.

SPSS 12.0 was used to calculate reliability coefficients (Cronbach's Alpha) for the SAES subscales and the correlations among the subscales. All of the correlations are statistically significant at the p = .01 level; however, they are low enough to warrant the conclusion that the subscales are separate. The subscale means, standard deviations, correlations, and reliabilities are shown in Table 3.
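The discriminant-validity reasoning applied to the subscale correlations can be illustrated with a simple spot check: two subscales should correlate positively yet well below their own reliabilities. The sketch below uses hypothetical subscale scores for two of the SAES dimensions, not the study's data.

```python
# Pearson correlation between two hypothetical subscale score lists,
# in the spirit of the inter-subscale analysis. A correlation well
# below the subscale reliabilities (.81-.95 in the study) supports
# treating the subscales as distinct.
from math import sqrt

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / sqrt(sum((a - mx) ** 2 for a in x)
                      * sum((b - my) ** 2 for b in y))

# Hypothetical mean scores on two subscales for five respondents.
climate = [2, 3, 4, 5, 6]
vision  = [4, 2, 6, 3, 5]
r = pearson(climate, vision)
print(round(r, 2))  # → 0.3
```

A moderate positive r like this is consistent with subscales that share a common efficacy construct while still measuring separable dimensions.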


The null hypothesis of model misfit is appropriately rejected based on the criteria typically used with confirmatory factor analysis. Therefore, the research hypothesis that the eight-factor model of the SAES fits is accepted. Thus, there exists strong evidence of the construct validity of the SAES. Additionally, the correlations among the subscales are low enough to warrant a conclusion that discriminant validity exists (i.e., the subscales are separate). The reliability coefficients of the eight SAES subscales ranged from good (.81) to excellent (.94). These are important steps in the successful measurement of school administrators' efficacy. The findings regarding the construct validity and reliability of the SAES are consistent with, and a significant addition to, past findings (e.g., McCollum, Kajs, & Minter, 2006). With these new findings, researchers and practitioners can rely with greater confidence on the validity and reliability of the SAES in their work.

Since school administrators as instructional leaders are responsible for student achievement, they need to participate in meaningful assessment practices to address their professional growth needs (Kaplan, Owings, & Nunnery, 2005). The SAES, which incorporates ELCC knowledge, skills and dispositions, has multiple assessment applications in the preparation of school administrators and the ongoing professional development of school principals. First, university principal preparation programs can use the SAES as a formative and summative instrument (i.e., pretest, mid-test, and posttest) to measure the growth and development of candidates as they progress during the program. Particularly, during the internship stage, the principal-mentor and candidate could use this scale as a diagnostic instrument to confirm areas of strength and to develop an action plan to address content, skill, and disposition needs.

Secondly, this scale can be used to evaluate the success of a principal preparation program, when coursework is aligned to the ELCC standards. For instance, the SAES is currently being used as one of a variety of strategies to monitor and evaluate the development of candidates in the University of Houston-Clear Lake's Collaborative Bilingual Administrator Training (CBAT) program and to gauge the program's curriculum and delivery mechanisms in competently preparing future principals. CBAT is a federally funded five-year project to prepare highly qualified bilingual school principals who can serve the needs of English Language Learners (ELLs) in the Houston Metropolitan area.

Thirdly, the SAES can be used as a self-assessment instrument for practicing school administrators, enabling them to review and reflect upon their own strengths and needs as efficacious professionals. Results from this reflective, self-monitoring, standards-driven procedure can provide the basis for an individualized improvement plan. The process can be especially useful since principals are usually expected to outline a yearly professional development program, underscoring the lifelong learning mind-set of professionals. The practice of self-assessment can reduce reluctance among educators to be evaluated because it places them in charge of the information about their individual needs or perceived weaknesses; thus, eliminating public exposure of shortcomings (Jackson, 2005). Individual management of personal information encourages educators to be candid in assessing their knowledge and skill levels, as well as dispositions. Moreover, this confidential practice, as well as the opportunity to choose the relevant training programs to address needs could serve to increase their motivational level to participate and achieve success in their educational experiences (Schultz & Schultz, 1998). The cognitive processes of self-evaluating, self-supervising, and self-motivating, along with goal-development, planning, attention management, implementation of learning approaches, and solicitation of assistance from others when necessary comprise self-regulated learning, which is a key element in becoming an effective learner and leader (Ormrod, 2003).

The research in the present article can be used in the development of school leaders, as the scale addresses knowledge, skills, and dispositions described in the ELCC Standards, applicable to the effective training of school administrators. The SAES can also serve as a viable tool for school administrators to self-evaluate their own strengths and needs as professionals, providing direction toward their professional improvement. Consequently, the SAES has the potential to be used for multiple purposes of evaluation. The SAES is especially pertinent, considering that few self-efficacy assessments designed specifically for school administrators are available (Lashway, 2003). Future research should address the relationships between, and comparisons with, Tschannen-Moran and Gareis's (2004) PSES, because both the SAES and PSES are psychometrically strong. It should be determined whether those two instruments provide the same or complementary information. Future research on the SAES should also establish how the constructs it measures relate to other constructs, such as motivation and work performance. This would further the validity program described by Benson (1998). Overall, the SAES is promising as a tool to validly and reliably measure school administrator efficacy.


Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, NJ: Prentice Hall.

Benson, J. (1998). Developing a strong program of construct validation: A test anxiety example. Educational Measurement: Issues and Practice, 17(1), 10-17.

Bentler, P. M. (1992). On the fit of models to covariances. Psychological Bulletin, 88, 588-606.

Browne, M. W., & Cudeck, R. (1993). Alternative ways of assessing model fit. In K. A. Bollen & J. S. Long (Eds.), Testing structural equation models (pp. 132-162). Newbury Park, CA: Sage.

Covington, M. V. (1984). The self-worth theory of achievement motivation: Findings and implications. The Elementary School Journal, 85(1), 1-20.

Cronbach, L. J., & Meehl, P. E. (1955). Construct validity in psychological tests. Psychological Bulletin, 52, 281-302.

Dimmock, C., & Hattie, J. (1996). School principals' self-efficacy and its measurement in the context of restructuring. School Effectiveness and School Improvement, 7(1), 62-75.

Green, R. L. (2001). Practicing the art of leadership: A problem-based approach to implementing the ISLLC standards. Columbus, OH: Merrill Prentice Hall.

Henson, R. K., Kogan, L. R., & Vacha-Haase, T. (2001). A reliability generalization study of the teacher efficacy scale and related instruments. Educational and Psychological Measurement, 61(3), 404-420.

Hu, L., Bentler, P. M., & Kano, Y. (1992). Can test statistics in covariance structure analysis be trusted? Psychological Bulletin, 112, 351-362.

Hu, L., & Bentler, P. M. (1998). Fit indexes in covariance structure modeling: Sensitivity to underparameterized model misspecification. Psychological Methods, 3(4), 424-453.

Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6(1), 11-55.

Hutchinson, S. R. & Olmos, A. (1998). Behavior of descriptive fit indexes in confirmatory factor analysis using ordered categorical data. Structural Equation Modeling, 5(4), 344-364.

Jackson, L. (2003). Teacher training: Is your staff development program working? Retrieved February 17, 2006, from Education World Web site: teacher_training005.shtml

Kajs, L. T., Decman, J. M., Cox, C., Willman, E., & Alaniz, R. (2002). Addressing mentor-principals' lifelong learning needs in the preparation of principal candidates. In G. Perreault & F. C. Lunenburg (Eds.), The changing world of school administration (pp. 325-346). Lanham, MD: Scarecrow Press.

Kaplan, L. S., Owings, W. A., & Nunnery, J. (2005). Principal quality: A Virginia study connecting interstate school leaders licensure consortium standards with student achievement. NASSP Bulletin, 89(643), 28-44.

Lashway, L. (2003). Improving principal evaluation. Eugene, OR: Clearinghouse on Educational Management. (ERIC Document Reproduction Service No. ED 482347)

Marsh, H. W., Balla, J. R., & Hau, K. (1996). An evaluation of incremental fit indexes: A clarification of mathematical and empirical properties. In G. A. Marcoulides & R. E. Schumacker (Eds.), Advanced structural equation modeling: Issues and techniques (pp. 315-353). Mahwah, NJ: Lawrence Erlbaum Associates.

McCollum, D. L., Kajs, L. T., & Minter, N. (2006). School Administrator's Efficacy: A model and measure. Education Leadership Review, 7(1), 42-48.

Murphy, J. (2005). Unpacking the foundations of ISLLC standards and addressing concerns in the academic community. Educational Administration Quarterly, 41(1), 154-191.

NPBEA (National Policy Board for Educational Administration). (2002-a). Educational leadership constituent council (ELCC): Instructions to implement standards for advanced programs in educational leadership. Fairfax, VA: National Policy Board for Educational Administration. Retrieved February 17, 2006, from Instructions to Standards for preparing ELCC Program Report at

NPBEA (National Policy Board for Educational Administration). (2002-b). Educational leadership constituent council (ELCC): New ELCC Standards--NCATE approved. Fairfax, VA: National Policy Board for Educational Administration. Retrieved February 17, 2006 at

Ormrod, J. E. (2003). Educational psychology: Developing learners. Upper Saddle River, NJ: Prentice-Hall.

Schultz, D., & Schultz, S. E. (1998). Psychology & work today: An introduction to industrial and organization psychology (7th ed.). Upper Saddle River, NJ: Prentice Hall.

Smith, W., Guarino, A., Strom, P., & Adams, O. (2006). Effective teaching and learning environments and principal self-efficacy. Journal of Research for Educational Leaders, 3(2), 4-23.

Tschannen-Moran, M., Woolfolk-Hoy, A., & Hoy, W. K. (1998). Teacher efficacy: Its meaning and measure. Review of Educational Research, 68(2), 202-248.

Tschannen-Moran, M., & Gareis, C. (2004). Principals' sense of efficacy: Assessing a promising construct. Journal of Educational Administration, 42, 573-585.

Daniel L. McCollum, University of Houston-Clear Lake

Lawrence T. Kajs, University of Houston-Clear Lake

Norma Minter, University of Houston-Clear Lake

Factor 1: Instructional Leadership and Staff Development
12. I am confident in my understanding of the total instruction
 program in my school.
13. I am able to understand the process of curriculum design,
 implementation and evaluation.
14. I am confident in my knowledge of best-practice research
 related to instructional practices.
15. I am able to develop a systematic process for mentoring
 teachers on my campus.
16. I am confident that I understand and can communicate to
 staff the complex instructional and motivational issues
 that are presented by a diverse student population.
17. I am confident in my skills to lead staff to understand and
 respect the diversity of our student population.
18. I am confident that I can lead staff to appreciate the
 kinds of knowledge and skills students and their families
 can add to the learning process.
19. I understand the development of a professional growth plan.
20. I have a clear sense of my own personal development needs
 and the resources I can access to address those needs.
21. I am confident in my skills to assess the staff development
 needs of a school.
22. I am confident that I possess the skills needed to
 implement the effective use of resources so that priority
 is given to supporting student learning.
23. I am confident in my skills to engage staff in the
 development of effective campus improvement plans that
 result in improved learning.

Factor 2: School Climate Development
5. I have the ability to engage students in the assessment of
 our school climate.
6. I have the ability to assess school climate using multiple
7. I have the ability to engage staff in the assessment of our
 school climate.
8. I have the ability to engage parents in the assessment of
 our school climate.
9. I am confident that I know how to use data about our school
 climate to improve the school culture in ways that
 promote staff and student morale.
10. I am confident that I know how to use data about our school
 climate to encourage appropriate student behavior.
11. I am confident that I know how to use data about our school
 climate to support a positive learning environment.

Factor 3: Community Collaboration
35. I understand community relations models that are needed to
 create partnerships with business, community, and
 institutions of higher education.
36. I am confident in my ability to use marketing strategies
 and processes to create partnerships with business,
 community, and institutions of higher education.
37. I can identify and describe the services of community
 agencies that provide resources for the families of
 children in my school.
38. I am confident in my skills to involve families and
 community stakeholders in the decision-making process at
 our school.
49. I am confident I can resolve issues related to budgeting.
50. I am able to supplement school resources by attaining
 resources from the community.
51. I am confident I can solicit community resources to resolve
 school issues.

Factor 4: Data-based Decision Making Aligned with Legal and Ethical Principles
39. I can make sound decisions and am able to explain them
 based on professional, ethical and legal principles.
40. I am confident in my ability to understand and evaluate
 education research that is related to programs and issues
 in my school.
41. I am confident in my ability to apply appropriate research
 methods in the school context.
42. I can explain to staff and parents the decision-making
 process of my school district.
43. I can explain to staff and parents how the governance
 process of my school is related to state and national
 institutions and politics.
44. I am confident in my ability to examine student performance
 data to extract the information necessary for campus
 improvement planning.
46. I can make decisions within the boundaries of ethical and
 legal principles.
47. I am able to explain the role of law and politics in
 shaping the school community.

Factor 5: Resource and Facility Management
29. I am confident in my knowledge of legal principles that
 promote educational equity.
30. I am able to provide safe facilities (building, playground)
 according to legal principles.
31. In accordance with legal principles, I am confident I can
 find information to address problems with facilities.
32. I am able to find the appropriate personnel to resolve
 facility-related problems.
33. I am confident in my ability to identify additional
 resources to assist all of the individuals in my school.

Factor 6: Use of Community Resources
25. I am confident I could use community resources to support
 student achievement.
26. I am confident I could use community resources to solve
 school problems.
27. I am confident I could use community resources to achieve
 school goals.

Factor 7: Communication in a Diverse Environment
24. I am confident in my skills to interact positively with the
 different groups that make up my school community.
34. I am confident in my ability to lead my staff in involving
 families in the education of their children.
45. I am confident in my communication abilities to lead in a
 variety of educational settings.

Factor 8: Development of School Vision
1. I am confident that I possess the skills to lead a school
 community in the development of a clear vision.
2. I can develop a vision that will help ensure the success of
 all students.
3. I am able to use strategic planning processes to develop
 the vision of the school.
4. I am confident that I can establish two-way communication
 with stakeholders (staff, parents, students, community)
 in order to obtain the commitment necessary for
 implementing the vision for our school.

Table 1. Seven ELCC Standards

Standard 1 Candidates who complete the program are educational
 leaders who have the knowledge and ability to promote the
 success of all students by facilitating the development,
 articulation, implementation, and stewardship of a school
 or district vision of learning supported by the school
 community.

Standard 2 Candidates who complete the program are educational
 leaders who have the knowledge and ability to promote the
 success of all students by promoting a positive school
 culture, providing an effective instructional program,
 applying best practice to student learning, and designing
 comprehensive professional growth plans for staff.

Standard 3 Candidates who complete the program are educational
 leaders who have the knowledge and ability to promote the
 success of all students by managing the organization,
 operations, and resources in a way that promotes a safe,
 efficient, and effective learning environment.

Standard 4 Candidates who complete the program are educational
 leaders who have the knowledge and ability to promote the
 success of all students by collaborating with families
 and other community members, responding to diverse
 community interests and needs, and mobilizing community
 resources.

Standard 5 Candidates who complete the program are educational
 leaders who have the knowledge and ability to promote the
 success of all students by acting with integrity, fairly,
 and in an ethical manner.

Standard 6 Candidates who complete the program are educational
 leaders who have the knowledge and ability to promote the
 success of all students by understanding, responding to,
 and influencing the larger political, social, economic,
 legal, and cultural context.

Standard 7 The internship provides significant opportunities for
 candidates to synthesize and apply the knowledge and
 practice and develop the skills identified in Standards
 1-6 through substantial, sustained, standards-based work
 in real settings, planned and guided cooperatively by the
 institution and school district personnel for graduate
 credit.

Table 2
Path Coefficients, Error Variances, R-squared, Means, and Standard
Deviations

Item  Path Coefficient  Error Variance  R-squared  M  SD

1 .87 .49 .76 6.06 .95
2 .89 .46 .79 6.08 .89
3 .82 .57 .68 5.99 .99
4 .68 .73 .47 6.31 .83
5 .70 .72 .49 6.23 .82
6 .82 .57 .67 6.04 .99
7 .76 .63 .60 6.10 .91
8 .76 .65 .58 5.96 .97
9 .88 .48 .77 6.01 .03
10 .88 .48 .77 5.99 .97
11 .86 .50 .75 6.09 .94
12 .70 .72 .48 5.77 1.07
13 .70 .72 .49 5.99 1.02
14 .75 .66 .56 5.64 1.06
15 .75 .67 .56 6.07 2.67
16 .83 .56 .69 5.86 1.05
17 .81 .58 .66 6.16 .97
18 .81 .59 .65 6.09 .93
19 .68 .73 .46 5.86 1.23
20 .66 .75 .43 6.30 .83
21 .83 .55 .70 6.01 .95
22 .84 .54 .70 6.09 .91
23 .80 .59 .65 5.97 .98
24 .82 .57 .67 6.28 .85
25 .79 .61 .63 6.05 .99
26 .94 .35 .88 5.72 1.09
27 .98 .21 .96 5.92 2.70
28 .59 .81 .34 6.67 .63
29 .79 .61 .63 5.74 1.22
30 .82 .57 .67 6.15 1.00
31 .87 .49 .76 6.08 1.03
32 .80 .60 .65 6.15 .96
33 .80 .64 .59 5.94 .97
34 .86 .58 .67 6.14 .89
35 .83 .56 .68 5.77 1.14
36 .78 .62 .61 5.57 1.17
37 .82 .67 .68 5.61 1.27
38 .85 .53 .72 5.87 1.07
39 .75 .66 .56 6.16 .97
40 .80 .60 .64 6.00 1.00
41 .76 .66 .57 5.86 1.03
42 .84 .55 .70 5.72 1.18
43 .83 .56 .69 5.56 1.21
44 .71 .71 .50 6.13 .93
45 .76 .65 .58 6.29 .86
46 .66 .75 .43 6.32 .92
47 .81 .59 .66 5.64 1.22
48 .62 .78 .39 6.66 .59
49 .60 .80 .36 5.74 1.18
50 .88 .47 .76 5.79 1.09
51 .86 .51 .74 5.77 1.07
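A minimal sketch of how Table 2's columns relate in a standardized CFA solution: an item's R-squared is the square of its path coefficient (loading), and the tabled "Error Variance" values track sqrt(1 - R-squared), i.e. the standardized error path, which is how they were read here. The loadings below are the first three rows of Table 2; the exact table values may differ in the last digit because the published coefficients are rounded.

```python
# Assumed relationship for a standardized CFA solution:
#   R-squared = loading ** 2
#   error path = sqrt(1 - R-squared)
# Loadings taken from items 1-3 of Table 2.

loadings = {1: 0.87, 2: 0.89, 3: 0.82}

for item, lam in loadings.items():
    r_squared = lam ** 2                 # variance explained by the factor
    error_path = (1 - r_squared) ** 0.5  # standardized residual path
    print(f"Item {item}: R2 ~ {r_squared:.2f}, error ~ {error_path:.2f}")
```

For item 1, this reproduces the tabled R-squared of .76 and error value of .49, which is consistent with the reading above.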

Table 3
Means, Standard Deviations, Correlations, (and Reliabilities) of
the SAES Subscales

Subscale   M    SD    1     2     3     4     5     6     7     8

1. ILSD  5.98  .80  (.94)
2. SCD   6.06  .80   .74  (.93)
3. CC    5.73  .95   .69   .62  (.85)
4. DBDM  5.92  .86   .72   .65   .73  (.92)
5. RFM   6.01  .88   .68   .61   .71   .78  (.90)
6. UCR   6.20  .82   .55   .48   .69   .56   .56  (.86)
7. CDE   6.23  .74   .74   .67   .73   .68   .67   .66  (.81)
8. DSV   6.11  .80   .69   .74   .57   .56   .51   .48   .66  (.90)

Note. ILSD = Instructional leadership and staff development,
SCD = School climate development, CC = Community collaboration,
DBDM = Data-based decision making aligned with legal and ethical
principles, RFM = Resource and facility management, UCR = Use of
community resources, CDE = Communication in a diverse environment,
DSV = Development of school vision
COPYRIGHT 2006 The DreamCatchers Group, LLC
Author: McCollum, Daniel L.; Kajs, Lawrence T.; Minter, Norma
Publication: Academy of Educational Leadership Journal
Date: Sep 1, 2006