Measuring Classroom Climate: A Validation Study of the My Child's Classroom Inventory-Short Form for Parents.

For decades now, increased interest in the topic of school climate has challenged educators to consider ways to gather pertinent information and make necessary changes to better meet the needs of students (Thapa, Cohen, Guffey, & Higgins-D'Alessandro, 2013). School climate refers to both the physical environment of a school (school grounds, classrooms, safety supports in place) and the interactions among its members (students; teachers and students; teachers and parents; and teachers, administrators, school counselors, and other staff). Thapa, Cohen, Guffey, and Higgins-D'Alessandro (2013) defined climate in terms of four main areas: safety, relationships, teaching and learning, and the institutional environment. Likewise, classroom climate defines similar facets related to the space, structure, interplay, and events that occur within a particular classroom environment (Adelman & Taylor, 2002). Research has associated positive school/classroom settings with more favorable academic and social/emotional outcomes for students. For example, evidence suggests that affirmative learning environments promote healthy relationships, a sense of connectedness, increased learning gains, fewer disciplinary problems, and higher promotion/graduation rates, all of which are outcomes of great importance to school counselors (Centers for Disease Control and Prevention [CDC], 2009; Gottfredson, Gottfredson, Payne, & Gottfredson, 2004; Reyes, Brackett, Rivers, White, & Salovey, 2012; Thapa et al., 2013; Thomas, Bierman, & Powers, 2011).

In a report on accountability conducted by the Learning Policy Institute, Melnick, Cook-Harvey, and Darling-Hammond (2017) offered the common phrase used by researchers: "What gets measured gets done" (p. 10). Following this rationale, the only sure way to accurately address the climate needs in a school is to first assess and analyze its current status and dynamics. The CDC (2009) strongly recommended that school climate reform be based on data-driven decision-making, whereby educators measure multiple levels of change and gather perceptions to gain a comprehensive perspective on the operational functioning of the school milieu. Moreover, the Every Student Succeeds Act (U.S. Department of Education [U.S. DOE], 2018) requires that school leaders include indicators of school quality and student success within their accountability reports. The U.S. DOE (2007) has even provided funding through several Safe and Supportive Schools grants to extend what is presently known about measurement, evaluation, and research in this area; however, valid and reliable tools to assess school/classroom climate are scarce, which further complicates the issue.

If school counselors are to target areas for improvement; select interventions, programming, and curricula that best fit their stakeholders' needs; and work with school staff, parents, and community leaders to tackle evident problems, they must be confident that they can rely on the gathered data. Collecting information from multiple sources or levels of influence (systems) can also reveal areas of alignment or disagreement (Lee & Shute, 2010). Although some instruments were developed to assist in this process, the majority have neither stood the test of time nor been rigorously investigated for their psychometric properties (La Paro, Pianta, & Stuhlman, 2004). This article provides support for a reliable and valid instrument for gauging parent perceptions of their child's classroom climate. This psychometric study builds on several previous studies (Mariani, Villares, Sink, Colvin, & Perhay Kuba, 2015; Villares, Mariani, Sink, & Colvin, 2016) that validated two other classroom climate measures, one for students and one for teachers, that school counselors can easily administer as part of ongoing program evaluation. We discuss findings from an exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) that support the reliability and structural validity of the instrument, the My Child's Classroom Inventory-Short Form (MCCI-SF; Sink, 2017), and present practical applications and implications for school counselors.

School Counselor Accountability Within a Systems Context

Providing a comprehensive, developmental guidance program based on data and incorporating evidence-based interventions are strategies that leaders in the school counseling field have long encouraged (Astramovich, Hoskins, & Coker, 2013; Bemak, Williams, & Chung, 2014; Carey & Dimmitt, 2008; Dahir & Stone, 2003; Dimmitt, 2009; Dimmitt, Carey, & Hatch, 2007; Kaffenberger & Young, 2013; Maras, Coleman, Gysbers, Herman, & Stanley, 2013; Poynton & Carey, 2006; Sink, 2009; Young & Kaffenberger, 2011). These evidence-based interventions are situated within the various contexts or layers of student life, including students' peer groups, learning environments (e.g., classroom and the larger school milieu), teachers, families, and so on. Thus, as school counselor researchers have suggested, evaluating the effectiveness of school counselors' evidence-based interventions on various student outcomes and on the learning environment itself must take into account systems or ecological frameworks (e.g., multitiered system of supports [MTSS]; McMahon, Mason, Daluga-Guenther, & Ruiz, 2013; Sink, 2019).

The ASCA National Model (American School Counselor Association [ASCA], 2012) is also clearly focused on direct services for students, provided through both preventative programming and responsive interventions. Working collaboratively with parents and teachers furthers the school counselor's ability to reach more students, address the highest needs, and do so in a meaningful and targeted manner (Brigman, Mullis, Webb, & White, 2008; Sink, 2008; Warren & Gerler, 2013). To most effectively execute the ASCA National Model and state- or district-developed comprehensive school counseling programs, counselors must not only have access to and adequately utilize routinely collected data but also intentionally gather additional measures to round out their understanding. Student-, class-, grade-, and school-level data (e.g., grades, attendance, test scores, and disciplinary infractions) can undoubtedly inform how a school counselor plans and delivers their systemic programming and services. However, additional data, including surveys/questionnaires (perceptual data) from various stakeholder groups (students, parents, teachers, administrators, community leaders), can add another layer of information and further promote understanding by providing more context to outcome data. Specifically, the Recognized ASCA Model Program designation, the highest recognition for quality school counseling programs, requires counselors to gather and report results from various sources of perception and outcome-based data. Therefore, researchers and leaders in the field strongly encourage practitioners to gather multiple sources of data when assessing, planning for, and evaluating the effectiveness of their comprehensive school counseling programs (Carey & Dimmitt, 2008; Mariani, Webb, Villares, & Brigman, 2015; Pellegrini & Bartini, 2000; Van Schoiack-Edstrom, Frey, & Beland, 2002; Zyromski & Mariani, 2016).

Furthermore, the delivery of direct services by school counselors fits well into the MTSS framework, which follows a step-by-step system for documenting universal screenings for behavior and academic struggles, instituting scientifically supported Tier 1 programs for all students, targeting students who need additional support with Tier 2 and Tier 3 interventions, and regularly monitoring their progress (McIntosh & Goodman, 2016). To follow this MTSS process, the school counselor relies on multiple types of data, which should include student outcome data and perception data from multiple stakeholders. A school counselor could help improve classroom climate through Tier 1 interventions, advocacy efforts, professional development for teachers, and parent workshops. For example, if competitiveness is an issue, school counselors could investigate this further by surveying students and teachers and then delivering a classroom intervention that fosters a climate of caring and support by reinforcing group collaboration and open communication. The school counselor in this example could also advocate for changes in school practices (e.g., Honor Roll recognition) and provide a workshop for parents, teaching them useful strategies to help their child(ren) manage the stress/anxiety they may be experiencing from being in a competitive classroom.

Stakeholder Support

The viability of a comprehensive program depends heavily on understanding and encouragement from key stakeholders: administrators, teachers, parents, and community partners. Parents are an essential stakeholder group; their support is critical to their own child's success and to the success of many other events that occur in a school. Lee and Shute's (2010) extensive literature review revealed two main categories of variables that can potentially impact academic achievement for K-12 students: personal factors (student engagement and learning strategies) and social-contextual factors (school climate and social-familial influences), both equally important in shaping a student's success. Specifically indicated among the many social-familial influences are parental variables, which speak to the critical role that parents play in students' lives in and outside of school. Hallinger, Bickman, and Davis (1996) indicated that parents' attitudes toward education and their philosophies on child-rearing could influence their schools' decisions on various programs and policies and ultimately affect their child's learning and attitudes. Furthermore, Albright, Weissberg, and Dusenbury (2011) reported that school and family partnerships are integral in creating "important opportunities for children to develop social, emotional, and academic competencies" (p. 1). When skill sets (academic or social/emotional) taught in school are reinforced at home, learning is further enhanced (Albright & Weissberg, 2010). Schueler, Capotosto, Bahena, McIntyre, and Gehlbach (2014) indicated that parents' views about school could influence their children's attitudes about school, affect their levels of engagement, and even impact their residential and school enrollment decisions.

Measuring Elementary Classroom Climate

In the matter of school and classroom climate, individual stakeholders' perceptions are quite important. Many researchers have attempted to triangulate how students, parents, teachers, and other personnel view their school in order to improve climate and increase outcomes (Allen & Fraser, 2007; Bocchi, Dozza, Chianese, & Cavrini, 2014; Cavrini, Chianese, Bocch, & Dozza, 2015; Mitchell, Bradshaw, & Leaf, 2010). In addition to measuring multiple stakeholders' perceptions of school climate, focusing specifically on classroom-level variables (e.g., the relationship between student and teacher) is critical. Researchers have demonstrated that classroom climate is a primary factor influencing bullying and aggressive behavior (Leff et al., 2011; Thomas et al., 2011). Classroom factors also have more influence on students' perceptions of the school environment than whole-school variables (Koth, Bradshaw, & Leaf, 2009). This is especially important in elementary schools, where students spend most of their time with one teacher (Evans, 2014).

Allen and Fraser (2007) designed a study to measure multiple stakeholders' perceptions of classroom climate variables by assessing students' and parents' attitudes toward science and achievement in their science class. Their sample of 520 students from Grades 4 and 5 and 120 of their parents/guardians in South Florida completed a shortened version (39 items) of the What Is Happening in this Class? (WIHIC) Questionnaire, a valid and reliable measure. The instrument examined whether perceptions for each group differed across six areas: student cohesiveness, teacher support, involvement, task orientation, equity, and investigation. The main findings of the study showed that both parent and student perceptions about science class were related to students' attitudes toward science, but that parent perceptions were stronger predictors of scores/outcomes than students' perceptions. This study highlighted the value of assessing parental attitudes, thoughts, and feelings specifically as they relate to the classroom environment.

Undoubtedly, collecting and assessing the opinions of different stakeholder groups, including parents, can provide valuable information to school counselors about the present learning environment (Boccanfuso & Kuhfeld, 2011; National Center on Safe and Supportive School Environments, 2018). School counselors can use this information to correct issues within the learning environment, offer support to students in need, and inform program planning and future delivery of services. Unfortunately, there are few user-friendly and valid measures to assess parents'/caregivers' perceptions of their children's elementary school classrooms in the United States. As mentioned above, the WIHIC (56- or 39-item versions) was developed outside the United States primarily to assess at least six dimensions of classroom climate. The WIHIC instrument, although potentially useful, was not specifically designed with the school counselor accountability role in mind. Moreover, some of its dimensions are not entirely relevant to evaluating a comprehensive school counseling program. Consequently, Sink and Spencer created and validated the My Class Inventory-Short Form Revised (MCI-SFR; Sink & Spencer, 2005) and My Class Inventory-Short Form for Teachers (TMCI-SF; Sink & Spencer, 2007) classroom climate questionnaires for students and teachers, respectively, in U.S. elementary schools to support school counselors' program evaluation practices. Recently, a valid and reliable parent edition, the My Child's Classroom Inventory-Short Form (MCCI-SF; Sink, 2017), was designed to assist school counselors in further appraising their influence on learning environments following classroom interventions and guidance lessons.

Research Aim and Questions

The primary intent of this psychometric study was to validate the MCCI-SF instrument. The MCCI-SF is a user-friendly, 25-item questionnaire designed to assess parental/caregiver perceptions of their child's classroom environment across five dimensions (i.e., Satisfaction, Friction, Difficulty, Competitiveness, and Cohesion; see next section and Fraser, 2012, for theoretical background). To establish the psychometric properties of the MCCI-SF, we collected and analyzed data to determine the measure's reliability and validity. We proposed the following research questions: (a) What is the underlying factor structure or dimensionality of the MCCI-SF? (b) Is the overall scale internally consistent?

Method

Participants and Sampling

A convenience sample of parents/caregivers (N = 770) was drawn from five volunteering elementary schools: two in Florida (FL; n = 358; School A: 15.3%, n = 118; School B: 31.2%, n = 240) and three in Minnesota (MN; n = 412; School C: 16.5%, n = 127; School D: 19.6%, n = 151; School E: 17.4%, n = 134). We did not collect socioeconomic data from respondents, but school district website data suggested that the five elementary schools largely served lower-middle- to upper-middle-class families.

We collected parental status and student grade-level data across the schools. Approximately 80% (77.1%, n = 594) of the respondents self-identified as mothers and 14.5% (n = 112) as fathers. Less than 1% (n = 7) of the sample was composed of other significant persons related to the child (e.g., grandparent or stepparent), with another 7.4% (n = 57) choosing not to specify their relationship. Furthermore, more than three fourths of the respondents had a child in a grade ranging from kindergarten to Grade 4 (78.6%, n = 605). Nearly 14% (13.8%, n = 106) of the parents represented students in either Grade 5 or 6. Five (0.6%) respondents had a child in a mixed-grade classroom. Seven percent (n = 54) of the sample did not report their child's grade level. A small percentage of elementary students participated in special programs: English Language Learners 4.4% (n = 34), Exceptional Student Education (ESE) 5.4% (n = 41), and Gifted 7.5% (n = 58).

Student age, gender, and ethnicity distributions were relatively similar across schools. The respondents' children ranged in age from 5 to 13 years (M = 8.00, SD = 1.85), with the preponderance (75.1%, n = 578) ranging from 6 to 10 years old. Approximately 7% (6.9%, n = 53) of the age data were missing. Five hundred eighteen (67.3%) children were identified by their parents as Caucasian or White and 61 (7.9%) as Hispanic or Latino/a. Other ethnic groups represented by the children were multiracial/multiethnic (7.5%, n = 58), African American or Black (4.4%, n = 34), Asian (1.7%, n = 13), Other (1.3%, n = 10), and not reported (9.9%, n = 76).

Parent respondent ethnicities were distributed as follows: Caucasian or White (71.4%, n = 550), Hispanic or Latino/a (8.4%, n = 65), African American or Black (4.4%, n = 34), multiracial/multiethnic (1.8%, n = 14), Other (0.9%, n = 7), and not reported (11%, n = 85). In terms of gender distributions for parent respondents, 77.3% (n = 595) were female, 14.7% (n = 113) male, and 0.9% (n = 7) did not report their gender. The student sample was composed of 37.7% (n = 290) females and 39.7% (n = 306) males. Fourteen (1.8%) individuals did not report their child's gender. These proportions were similar for each of the five participating schools.

Procedures

Following institutional review board approval, the research team contacted a school counselor and an administrator at each of the participating schools. The school personnel agreed to distribute the consent form and an electronic copy of the MCCI-SF to the parents/caregivers of all students enrolled in Grades K-6. The electronic version of the MCCI-SF was created and distributed using SurveyMonkey (www.surveymonkey.com). Each school made computers available to parents who did not have access to a personal computer or mobile device. The school counselors also provided a paper copy of the instrument for anyone who preferred not to complete the measure electronically.

To ensure anonymity, the parents/caregivers were asked to provide only nonidentifying demographic information on the survey (e.g., the caregiver's ethnicity, gender, and relationship to the child and their child's current grade level, placement in special programs, age, ethnicity, and gender). After providing the demographics, the parents/caregivers selected their child's grade level and teacher's name from a drop-down menu before proceeding to complete the 25 items of the MCCI-SF. Any data collected in paper-and-pencil format were entered into the SurveyMonkey application by a member of the research team. Following a 3-week data collection period, we shared the school-level data with each school counselor to provide information about parents'/caregivers' perceptions of the classroom climate as part of the annual evaluation of each school's comprehensive school counseling program.

Instrumentation

The MCCI-SF was designed using the conceptual model and item content that framed the original student inventory (My Class Inventory; Fisher & Fraser, 1981; Fraser, Anderson, & Walberg, 1982; Fraser & O'Brien, 1985), its subsequent student version (MCI-SFR; Mariani, Villares, et al., 2015; Sink & Spencer, 2005), and the teacher revision (TMCI-SF; Sink & Spencer, 2007; Villares et al., 2016). Five characteristics or dimensions of classroom climate (i.e., Satisfaction, Cohesion, Friction, Difficulty, and Competitiveness) were proposed. As shown in Table 1, each dimension consisted of five statements. Respondents were asked to register their level of agreement with each statement on a scale ranging from 1 (strongly disagree) to 5 (strongly agree).

Statistical Analyses

In accordance with the recommendations provided by leading psychometricians (Nunnally & Bernstein, 1994) and counseling scholars (Dimitrov, 2012; Mvududu & Sink, 2013), we first examined the dimensionality and factorial validity of the MCCI-SF Questionnaire using EFA. Subsequently, we tested the inventory's proposed factor structure using CFA.

Initial scale dimensionality and scoring. Although the MCCI-SF reflects the same conceptual framework and related item content as its precursors (MCI-SFR and TMCI-SF), the new measure is different enough (e.g., item wording revised to reflect the parent voice) to warrant an EFA on the FL data set. Thus, to determine the underlying dimensionality of the initial version of the questionnaire, we computed a prerotation exploratory factor analysis (PFA) on the FL sample. Based on previous MCI-SFR research (Mariani, Villares, et al., 2015; Sink & Spencer, 2005) showing that the factors were at least low to moderately correlated, the initial factor matrix was rotated using an oblique method (direct oblimin, Δ = 0). The criteria for factor extraction and subsequent rotation were eigenvalues (λ) greater than 1, percentage of variance accounted for by each derived factor (≥5%), magnitude of item communalities (≥.40), and review of the scree plot and parallel analysis results. Respondent factor scores were calculated by summing the raw scores for the items comprising each retained factor. Given the exploratory nature of this analysis, .30 was set as the threshold to mark a factor.
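The EFA itself was run in standard statistical software; purely as an illustration, the following Python sketch shows how a comparable prerotation factor analysis with a direct oblimin rotation and the retention criteria described above could be reproduced with the open-source factor_analyzer package. The data file and item column names are assumptions, not the authors' materials.

```python
# Illustrative sketch only (not the authors' SPSS procedure): exploratory factor
# analysis with direct oblimin rotation using factor_analyzer.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

# Hypothetical data frame: one row per parent, one column per MCCI-SF item (1-5 Likert).
mcci_items = pd.read_csv("mcci_fl_items.csv").dropna()  # assumed file name

# Factorability checks of the kind reported in the article (Bartlett's test, KMO).
chi_square, p_value = calculate_bartlett_sphericity(mcci_items)
kmo_per_item, kmo_total = calculate_kmo(mcci_items)
print(f"Bartlett chi2 = {chi_square:.2f}, p = {p_value:.4f}, KMO = {kmo_total:.2f}")

# Unrotated solution to inspect eigenvalues (retain factors with eigenvalue > 1).
fa_unrotated = FactorAnalyzer(n_factors=mcci_items.shape[1], rotation=None)
fa_unrotated.fit(mcci_items)
eigenvalues, _ = fa_unrotated.get_eigenvalues()
n_factors = int((eigenvalues > 1).sum())

# Oblique (direct oblimin) rotation of the retained factors.
fa = FactorAnalyzer(n_factors=n_factors, rotation="oblimin")
fa.fit(mcci_items)
loadings = pd.DataFrame(fa.loadings_, index=mcci_items.columns)
communalities = pd.Series(fa.get_communalities(), index=mcci_items.columns)
print(loadings.round(2))
print(communalities.round(2))
```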

CFA. We deployed a CFA, a type of structural equation modeling, to test the validity of the factor structure found with FL respondents with another large group of elementary school parents from MN. Specifically, we conducted a second-order CFA (maximum likelihood estimation used with missing data) using IBM Amos (Version 24; Byrne, 2016) on the MN respondents. To examine model fit between the population covariance matrix Σ and the sample covariance matrix S (covariance structure analysis; Brown, 2015; Schumacker & Lomax, 2016), we used several fit indices. Even though guidelines for acceptable fit indices vary depending on the source and sample size, some consensus is available (Kline, 2016; Marsh, Hau, & Wen, 2004; Tabachnick & Fidell, 2013; Weston & Gore, 2006). To determine whether potential differences between the observed and implied variance-covariance matrices exist, we first considered the χ² test (CMIN in Amos). Preferably, a nonsignificant χ² should be found, suggesting that the data set and the proposed conceptual model are comparable. Because a nonsignificant result is uncommon with large sample sizes and skewed data, alternative fit indices are reported: the relative or normed χ²/df (Amos CMIN/DF; a ratio below 2.00 represents a good fit); the root mean square error of approximation (RMSEA; values of .06 or lower suggest the sample data fit the hypothesized model; Tabachnick & Fidell, 2013); and the comparative fit index (CFI; ranges from 0 to 1.0, with values closer to 1.0 indicating better fit; CFI > .95 suggests strong fit). Of these, RMSEA and CFI are less sensitive to sample size. Given that this study attempted to cross-validate the dimensionality of the MCCI-SF, there was little theoretical justification to correlate error terms, and because the practice is not fully accepted by statisticians (e.g., Tabachnick & Fidell, 2013), we did not deploy this procedure.
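To make these fit indices concrete, the sketch below computes the normed χ², RMSEA, and CFI from their standard formulas. The function is illustrative rather than part of the authors' Amos workflow; the values plugged in are the model and independence-model χ² statistics reported later in the Results, and the resulting RMSEA can differ slightly from the Amos value depending on the sample size and estimator Amos used.

```python
# Illustrative computation of the fit indices discussed above from chi-square
# statistics (standard formulas; not output copied from Amos).
from math import sqrt

def fit_indices(chi2_model, df_model, chi2_null, df_null, n):
    """Return normed chi-square, RMSEA, and CFI."""
    normed_chi2 = chi2_model / df_model
    # RMSEA: misfit per degree of freedom, scaled by sample size.
    rmsea = sqrt(max(chi2_model - df_model, 0) / (df_model * (n - 1)))
    # CFI: improvement of the target model over the independence (null) model.
    cfi = 1 - max(chi2_model - df_model, 0) / max(chi2_null - df_null, chi2_model - df_model, 1e-12)
    return normed_chi2, rmsea, cfi

# Values reported later for the MN cross-validation sample (usable n = 339).
# Note: Amos reported RMSEA = .06 for this model; small differences can arise
# from the estimator (FIML with missing data) and the n it uses internally.
normed, rmsea, cfi = fit_indices(chi2_model=491.22, df_model=183,
                                 chi2_null=3338.64, df_null=231, n=339)
print(f"chi2/df = {normed:.2f}, RMSEA = {rmsea:.3f}, CFI = {cfi:.2f}")
```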

Reliability. To estimate the internal consistency (i.e., interrelatedness of a sample of questionnaire items) of the derived factors, we calculated Cronbach's α coefficients. Reports differ about the acceptable values of α, ranging from .70 to .95 (Tavakol & Dennick, 2011). For attitudinal surveys such as the measure under study, αs in the .70-.85 range are expected.
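As a point of reference, coefficient α can be computed directly from the item variances and the variance of the summed scale. The short sketch below mirrors that formula; it reuses the hypothetical mcci_items frame from the earlier sketch, and the column names (here the four Satisfaction items identified later in the Results) are assumptions.

```python
# Illustrative Cronbach's alpha from raw item scores (hypothetical data frame).
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = items.dropna()                    # listwise deletion, as is typical for alpha
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Usage: alpha for the Satisfaction subscale (column names are assumptions).
satisfaction = mcci_items[["item6", "item11", "item16", "item21"]]
print(round(cronbach_alpha(satisfaction), 2))
```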

Results

Data Preparation

We first examined data for problematic issues including miscoding, irregular response patterns, data entry errors, and missing information. Across the two data sets, approximately 15% of the item data were absent with no discernible pattern in missing data. Therefore, we treated the omitted data as missing at random. Cases with more than 5% of data missing were removed, leaving a total usable sample size of 657. We recoded negatively worded items into the positive direction, allowing for consistent and administrator-friendly scoring and interpretation.
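The screening steps described above (dropping cases missing more than 5% of their item responses and reverse coding negatively worded items on the 1-5 scale) can be expressed compactly in pandas. The sketch below is hypothetical: the file and column names are assumptions, and the list of negatively worded items is left as a placeholder because the article does not enumerate them.

```python
# Illustrative data-screening sketch (hypothetical file and column names).
import pandas as pd

raw = pd.read_csv("mcci_raw_responses.csv")          # assumed file of 1-5 item responses
item_cols = [c for c in raw.columns if c.startswith("item")]

# Drop cases missing more than 5% of their item responses.
missing_rate = raw[item_cols].isna().mean(axis=1)
clean = raw.loc[missing_rate <= 0.05].copy()

# Reverse-code negatively worded items so that higher always means more positive.
NEGATIVE_ITEMS = []   # fill in from the instrument's scoring key (not listed in the article)
for col in NEGATIVE_ITEMS:
    clean[col] = 6 - clean[col]                      # 1<->5, 2<->4 on a 5-point scale
```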

Descriptive Statistics and Assumption Checking

Descriptive statistics for all respondents across the initial 25 items are reported in Table 1. The majority of item means ranged from 3.00 to 4.10. Item SDs were similar, generally less than 1.00. We deployed the Kolmogorov-Smirnov test with the Lilliefors significance correction (Schumacker & Lomax, 2016) to test the assumption of multivariate normality of the FL and MN data sets. As expected with moderate to large data sets (Field, 2018), the item distributions were generally non-normal (p < .05). However, visual inspection of item quantile-quantile (Q-Q) plots and histograms indicated that most of the item distributions approximated normality, with a moderate level of skew and some limited kurtosis. Item 1 generated excessive skew and kurtosis and thus was removed from subsequent analyses. Three additional items had kurtosis values ranging from 3.15 to 3.53. Finally, interitem correlations were largely low to moderate (see Table 2).
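A comparable univariate screening (means, SDs, skew, kurtosis, and the Lilliefors-corrected Kolmogorov-Smirnov test) can be scripted as follows. This is an illustrative scipy/statsmodels version, not the authors' SPSS output, and it reuses the hypothetical cleaned item frame from the earlier sketch.

```python
# Illustrative per-item descriptive and normality screening (not the authors' SPSS output).
import pandas as pd
from scipy.stats import skew, kurtosis
from statsmodels.stats.diagnostic import lilliefors

def screen_items(items: pd.DataFrame) -> pd.DataFrame:
    rows = []
    for col in items.columns:
        x = items[col].dropna()
        ks_stat, p = lilliefors(x, dist="norm")   # KS test with Lilliefors correction
        rows.append({
            "item": col,
            "mean": x.mean(),
            "sd": x.std(ddof=1),
            "skew": skew(x),
            "kurtosis": kurtosis(x),              # excess kurtosis (normal = 0)
            "lilliefors_p": p,
        })
    return pd.DataFrame(rows).round(3)

print(screen_items(clean[item_cols]))
```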

PFA and Reliability Analysis (FL Data Set)

We computed a PFA on the FL data set (n = 318) on 24 items, generating acceptable indices for the factorability of the intercorrelation matrix (Kaiser-Meyer-Olkin estimate of sampling adequacy = .84; Bartlett's test of sphericity, df = 276, χ² = 3,139.83, p < .001). Initial communalities (h²) ranged from .19 (Item 19) to .64 (Item 20), with the majority greater than .35. Based on the total percentage of explained variance (45.52%), the four eigenvalues greater than 1.00, and analyses of the scree and parallel plots, four factors consisting of 24 items were retained for oblique rotation.
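Parallel analysis, one of the retention criteria noted above, compares the observed eigenvalues of the item correlation matrix with the average eigenvalues obtained from random data of the same dimensions. A minimal sketch of this comparison is shown below, again assuming the hypothetical cleaned item frame from the earlier sketches.

```python
# Minimal parallel-analysis sketch: retain factors whose observed correlation-matrix
# eigenvalues exceed the mean eigenvalues of random data of the same shape
# (illustrative only; not the authors' procedure).
import numpy as np

def parallel_analysis(items, n_sims=100, seed=0):
    data = np.asarray(items, dtype=float)
    n, k = data.shape
    observed = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    rng = np.random.default_rng(seed)
    random_eigs = np.empty((n_sims, k))
    for s in range(n_sims):
        sim = rng.normal(size=(n, k))
        random_eigs[s] = np.sort(np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False)))[::-1]
    threshold = random_eigs.mean(axis=0)
    return int(np.sum(observed > threshold))   # suggested number of factors to retain

print(parallel_analysis(clean[item_cols].dropna()))
```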

The 24-item rotated factor pattern matrix revealed substantial cross-loadings for Items 7, 12, and 14. After removing these items, simple structure was achieved, revealing a robust and interpretable 21-item, four-factor pattern (see Table 3). Most loadings were well above .40. Factor 1 was composed of 7 items (2, 5, 10, 15, 20, 22, 25) initially thought to be aligned with two separate factors (Friction and Cohesion). Given the content of these items, this factor was referred to as Peer Relations. Six items (3, 8, 13, 17, 18, 23) marked Factor 2 (Competitiveness of the classroom), and 4 items (4, 9, 19, 24) constituted Factor 3 (Difficulty related to the coursework). The content of the 4 items (6, 11, 16, 21) comprising Factor 4, Satisfaction, addressed parents' satisfaction with the classroom learning experience. Factor intercorrelations ranged from .08 (Factor 2 with 3) to -.62 (Factor 1 with 4). Higher summed scores for the Peer Relations and Satisfaction dimensions and lower scores for the Competitiveness and Difficulty dimensions are preferable. Finally, the computed Cronbach's αs suggested that the factors were reliable, ranging from .71 (Difficulty, Factor 3) to .88 (Satisfaction, Factor 4).

CFA and Reliability Analysis (MN Data Set)

Assumptions and hypothesized model. Using the MN data set (n = 412) as a cross-validation sample, we first examined the assumptions underlying CFA using IBM SPSS (Version 24). After conducting a missing data analysis, discussed above, we reduced the sample size to 339, generating a ratio of cases to measured or observed variables (21) of approximately 16 to 1. Issues related to item normality, outliers, and multicollinearity were discussed above. The hypothesized four-factor model and derived standardized coefficients are shown in Figure 1. The unobserved latent factors are depicted as large ovals (i.e., Peer Relations, Competitiveness, Difficulty, and Satisfaction factors). The observed variables (21 questionnaire items) are represented by rectangular boxes, and item error terms are depicted by small ovals (i.e., err). The factors are assumed to covary with each other.
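Although the authors estimated the model in IBM Amos, a four-factor measurement model of the kind depicted in Figure 1 can be written in lavaan-style syntax and fit with an open-source package such as semopy. The sketch below is an assumed equivalent rather than the authors' workflow; the item-to-factor assignments follow the EFA solution reported above, and the itemN column names and data file are hypothetical.

```python
# Illustrative CFA of the four-factor model (lavaan-style syntax via semopy);
# assumed open-source stand-in for the authors' Amos analysis.
import pandas as pd
import semopy

MODEL_DESC = """
PeerRelations   =~ item2 + item5 + item10 + item15 + item20 + item22 + item25
Competitiveness =~ item3 + item8 + item13 + item17 + item18 + item23
Difficulty      =~ item4 + item9 + item19 + item24
Satisfaction    =~ item6 + item11 + item16 + item21
"""

mn_items = pd.read_csv("mcci_mn_items.csv")    # assumed cross-validation data file
model = semopy.Model(MODEL_DESC)
model.fit(mn_items)                            # maximum likelihood estimation by default
stats = semopy.calc_stats(model)               # fit statistics, including chi2, CFI, RMSEA
print(stats[["DoF", "chi2", "CFI", "RMSEA"]])
```

The χ², CFI, and RMSEA values printed by such a run can then be judged against the cutoffs described in the Statistical Analyses section.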

Model estimation. Not unexpectedly, the independence model, which hypothesizes that all variables are uncorrelated, was rejected, χ²(df = 231) = 3,338.64, p < .001. We then examined the hypothesized four-factor model. The χ²/df ratio (CMIN/df = 491.22/183) of 2.68 (p < .001) was slightly above the 2.00 threshold for a good fit. The χ² difference test result indicated clearly that the hypothesized model was a significant improvement over the independence model. The CFI was .90, below the .95 cutoff for a strong fit. The RMSEA index (.06; CI [.05, .07]) suggested a relatively good fit between the data and the hypothesized model. Figure 1 shows the statistically significant (p < .001) model coefficients in their standardized form. The dimensions of Peer Relations and Satisfaction with the classroom learning experience were strongly correlated (r = .70), with low to low-moderate intercorrelations among the other factors.

Reliability estimation. For comparison purposes, we computed Cronbach's α coefficients across dimensions with the MN data set (n = 337). The findings suggest that the first and fourth factors were the most homogeneous (Peer Relations, Factor 1 = .86; Competitiveness, Factor 2 = .75; Difficulty, Factor 3 = .70; and Satisfaction, Factor 4 = .91).

Discussion

The aim of the current study was to create and establish the validity and reliability of the MCCI-SF, a parent classroom climate inventory that school counselors can use to gather important perception data about their comprehensive program. To accomplish this end, we first sought to determine the underlying dimensionality of the MCCI-SF, a new parent measure of classroom climate complementary to the previously validated student (MCI-SFR) and teacher (TMCI-SF) versions. Second, the study attempted to cross-validate the derived factor structure with another sample of parent respondents. We also examined factor reliabilities in both samples. Below we discuss the evidence for the factorial validity and reliability of the new inventory and provide implications for school counseling accountability practice.

Factorial Validity

Initial findings provided evidence for the proposed dimensionality of the MCCI-SF with the FL parent/caregiver sample. Four of the five conceptual factors were derived. Given the item content, two proposed dimensions thought to represent classroom climate (Friction and Cohesion) actually merged into a single factor entitled Peer Relations. Simple structure was achieved with these factors: Peer Relations (7 items), Competitiveness in the classroom (6 items), Difficulty of the coursework (4 items), and Satisfaction with the classroom learning experience (4 items). The findings largely replicated the factor structure of the TMCI-SF (Sink & Spencer, 2007; Villares et al., 2016). In particular, similar to teacher perceptions, parent perceptions of their child's (peer) relations in classrooms were viewed as a single dimension rather than two separate factors of Cohesion and Friction.

The next step in the development of the MCCI-SF was to cross-validate the derived four-factor model using another sample of parent respondents. We computed a single-level, four-factor CFA model (see Figure 1) with data gathered from a large group of MN parents/caregivers. Based on salient fit indices (e.g., CFI = .90; RMSEA = .06, CI [.05, .07]), the CFA results indicate that the fit between the initial (hypothesized) four-factor solution and the MN parent data set was acceptable for a new measure (Tabachnick & Fidell, 2013). The findings were also largely similar to the final CFA path diagram of the teacher version (TMCI-SF; Villares et al., 2016). In short, psychometric research on the MCCI-SF and TMCI-SF generated similar dimensionality with different adult respondent groups, and the same factors were confirmed in each inventory: Satisfaction, Competitiveness, Difficulty, and Peer Relations.

The magnitude and directionality of the interfactor correlations generated from the parent (MCCI-SF) and teacher (TMCI-SF; Villares et al., 2016) CFAs were not entirely consistent. Specifically, unlike the TMCI-SF results (Villares et al., 2016), where Peer Relations was negatively correlated with Difficulty (-.44) and Competitiveness (-.22), these correlations were positive and low (rs = .26 and .17, respectively) with the MN parent/caregiver data set. Moreover, whereas parents rated classroom competitiveness and coursework difficulty as low to moderately correlated with their perceptions of classroom satisfaction (rs = .24 and .38, respectively), teachers rated these factors as negatively associated with satisfaction (Villares et al., 2016). Interpreting these correlational differences between parent and teacher samples is somewhat speculative. However, it stands to reason that teachers are better able to recognize the nuances of their classrooms and the learning processes that occur there (Coleman, 2018). As such, elementary school teachers, unlike the children's parents, are more likely to understand that overly competitive learning environments coupled with highly challenging schoolwork are negatively associated with classroom satisfaction ratings (Brophy, 2017).

Providing further evidence of the importance of teachers and school counselors nurturing the social/emotional learning of children, the current study yielded a strong positive correlation (.70) between classroom satisfaction and peer relationships, replicating the magnitude of this correlation (r = .71) found in the earlier TMCI-SF study (Villares et al., 2016). In other words, this correlation strongly suggests that parents, like teachers, who rate their children's classrooms as possessing positive peer relations will most likely view the classroom environment as satisfying. This result parallels the considerable evidence supporting the moderate to strong links between positive student behaviors (e.g., achievement motivation, social/emotional development) and teacher-facilitated healthy and caring learning environments (Baker, 2006; Furrer, Skinner, & Pitzer, 2014). Research on parent, teacher, peer, and student dynamics further indicates that positive interrelationships among these different systems and contexts within the school setting (Lee & Shute, 2010) are vital to establishing supportive learning environments (Coleman, 2018). Overall, the results provided sufficient evidence for the factorial validity of the MCCI-SF.

Factor Reliability

To further establish a measure's construct validity, researchers should provide evidence of homogeneity among the items comprising each factor (Streiner, 2003). Based on the guidelines for judging the suitability of coefficient αs (Ponterotto & Ruckdeschel, 2007), we found the four derived factors of the MCCI-SF to be at least adequately reliable for the FL and MN parent samples, with αs in the low .70s for the Competitiveness and Difficulty factors, approximately .90 for Satisfaction, and the mid-.80s for Peer Relations. The magnitude of these reliability coefficients generally matches those found in a previous study using the TMCI-SF (Satisfaction, α = .82; Peer Relations, α = .87; Difficulty, α = .72; Competitiveness, α = .48; Villares et al., 2016) and reflects the αs commonly reported in other psychometric studies of classroom climate questionnaires (see Fraser, 2012, for a review).

In summary, the current study was able to substantiate the psychometric properties of the MCCI-SF, a new measure of elementary classroom climate. The measure's dimensionality was determined through a series of EFAs and CFAs with two parent samples. Subsequently, using coefficient α, we also established the homogeneity (internal consistency) of the inventory items with both parent samples.

Implications for School Counseling Practice

According to the ASCA National Model (ASCA, 2012), accountability and program evaluation are two essential functions of school counselors. As professionals, they are to appraise their impact on student learning, which includes students' social/emotional development and prosocial skills--the building blocks of academic performance. Learning, as suggested earlier, occurs in a larger context, starting in the school environment and then, more specifically, in the classroom. As one of the social/emotional learning experts in the school, the counselor can provide preventative programming and targeted services aimed at improving the overall school climate (Cleveland & Sink, 2018). Thus, appraising the quality of the learning environment from a variety of stakeholder perspectives (student, teacher, and parent) is incumbent upon school counselors. To accomplish this task, elementary school counselors can use a number of well-established classroom climate tools (Fraser, 2012). Although several measures exist that students can complete, no psychometrically sound measure was available to assess parent perceptions of classroom climate within a school counseling context. Based on the results of this study, counselors now possess such a questionnaire, which was designed to be user-friendly and provide accurate perceptional data. In its final version, the MCCI-SF has 21 items representing four critical dimensions of classroom climate (Peer Relations, Competitiveness in the classroom, Difficulty of the coursework, and Satisfaction with the classroom learning experience). The inventory takes approximately 5-8 min to complete and 5 min to score. It can be administered electronically or in a paper format.
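To make the scoring concrete, the sketch below sums raw item responses into the four subscale scores described here, with the item-to-subscale mapping taken from the factor solution reported in the Results; the data file and column names are hypothetical.

```python
# Illustrative subscale scoring for the final 21-item MCCI-SF (item-to-subscale
# mapping from the reported factor solution; file and column names are hypothetical).
import pandas as pd

SUBSCALES = {
    "PeerRelations":   [2, 5, 10, 15, 20, 22, 25],
    "Competitiveness": [3, 8, 13, 17, 18, 23],
    "Difficulty":      [4, 9, 19, 24],
    "Satisfaction":    [6, 11, 16, 21],
}

def score_mcci(responses: pd.DataFrame) -> pd.DataFrame:
    """Sum raw 1-5 item responses into the four MCCI-SF subscale scores."""
    scores = pd.DataFrame(index=responses.index)
    for scale, items in SUBSCALES.items():
        scores[scale] = responses[[f"item{i}" for i in items]].sum(axis=1)
    return scores

# Higher Peer Relations and Satisfaction scores and lower Competitiveness and
# Difficulty scores reflect a more favorable parent-rated classroom climate.
responses = pd.read_csv("mcci_parent_responses.csv")   # assumed file of item responses
print(score_mcci(responses).describe().round(2))
```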

Research has shown that school and classroom climate data should be collected from multiple sources to most effectively improve perceptions and prevent problems (Ramsey, Spira, Parisi, & Rebok, 2016). With the development of the MCCI-SF, elementary school counselors now have a full complement of classroom climate tools to assess classroom dynamics and work toward improving school climate and the school improvement process as a whole (Cleveland & Sink, 2018). Drawing from and analyzing data collected from the parent, student (MCI-SFR), and teacher (TMCI-SF) inventories, school counselors will have a broader view of the needs within each classroom. These measures in combination will provide trends in perceptional data, enabling counselors to develop more targeted interventions within their school counseling programs and to assist with multitiered systems of support. Without data from multiple sources assessing the classroom environment, school counselors are likely to have less administrative support for spending 80% of their time delivering direct services and 20% delivering indirect services to all relevant constituents, as recommended by ASCA (2012). Therefore, by utilizing multisystemic data, school counselors are better positioned to advocate for increased support and resources because, as mentioned earlier, "what gets measured gets done" (Melnick et al., 2017, p. 10).

Limitations and Suggestions for Future Research

Several research caveats should be taken into consideration when interpreting the results of this study. First, the MCCI-SF is a self-report measure, which could lead to a social desirability bias. Second, because we used convenience sampling in both states and the majority of the respondents were female and self-identified as mothers, the participant pool was not representative of all parents/caregivers. Relatedly, the study was conducted in two separate regions of the United States, and generalizability of the findings to schools in other regions or with different demographics may be problematic. Finally, missing data, while relatively minimal and probably occurring at random, may have reduced the accuracy and applicability of the inferential statistics.

The findings of this investigation, although promising, need to be cross-validated with parent/caregiver samples in other regions of the United States, including the Western and Northeastern states. Larger and more diverse samples are needed for factorial invariance testing, particularly for gender and ethnicity variables. Subsequent studies should examine the stability of parent/caregiver ratings over time (test-retest reliability) and the MCCI-SF's discriminant and convergent validity. For example, investigators should collect student (MCI-SFR), teacher (TMCI-SF), and parent (MCCI-SF) classroom climate perceptional data from a diverse set of individual elementary schools to determine whether the instruments' factors intercorrelate. Finally, qualitative research with elementary school counselors is also needed to better understand how they might use the parent inventory within their accountability practices.

Summary and Conclusion

Recognizing the multisystemic nature of schooling and the diverse and complex influences on student learning and development, school counselors, within their program evaluation role, should periodically appraise the climates of the schools and classrooms they serve (ASCA, 2012; Astramovich et al., 2013; Cleveland & Sink, 2018; Erford, 2015). School counselors can collect classroom data using a variety of sources, including observational analysis, interviews of pertinent individuals, and psychometrically sound perceptional measures. Regarding the third data source, best accountability practices should attempt to understand the perspectives of those stakeholders who are most likely to affect the health of the classroom environment, including students, teachers, and parents (e.g., Bocchi et al., 2014). Thus, to fill this gap in measurement tools, the current study provided evidence for the reliability and factorial validity of the MCCI-SF, a measure designed to complement the student and teacher classroom climate versions. The MCCI-SF is useful as an accountability tool for school counselors; it is brief, user-friendly (easy to administer and score), psychometrically sound, and can be given to diverse samples of parent respondents.

DOI: 10.1177/2156759X19860132

Declaration of Conflicting Interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding

The author(s) received no financial support for the research, authorship, and/or publication of this article.

References

Adelman, H. S., & Taylor, L. (2002). Classroom climate. In S. W. Lee, P. A. Lowe, & E. Robinson (Eds.), Encyclopedia of school psychology (pp. 304-312). Thousand Oaks, CA: Sage.

Albright, M. I., & Weissberg, R. P. (2010). School-family partnerships to promote social and emotional learning. In S. L. Christenson & A. L. Reschly (Eds.), Handbook of school-family partnerships (pp. 246-265). New York, NY: Routledge.

Albright, M. I., Weissberg, R. P., & Dusenbury, L. A. (2011). School-family partnership strategies to enhance children's social, emotional, and academic growth. Newton, MA: National Center for Mental Health Promotion and Youth Violence Prevention, Education Development Center.

Allen, D., & Fraser, B. J. (2007). Parent and student perceptions of classroom learning environment and its association with student outcomes. Learning Environments Research, 10, 67-82. doi: 10.1007/s10984-007-9018-z

American School Counselor Association. (2012). ASCA National Model: A framework for school counseling programs (3rd ed.). Alexandria, VA: Author.

Astramovich, R. L., Hoskins, W. J., & Coker, J. K. (2013). Organizing and evaluating data-driven school counseling programs (2nd ed.). Dubuque, IA: Kendall Hunt.

Baker, J. A. (2006). Contributions of teacher-child relationships to positive school adjustment during elementary school. Journal of School Psychology, 44, 211-229. doi: 10.1016/j.jsp.2006.02.002

Bemak, F., Williams, J. M., & Chung, R. C. (2014). Four critical domains of accountability for school counselors. Professional School Counseling, 18, 100-110. doi: 10.1177/2156759X0001800101

Boccanfuso, C., & Kuhfeld, M. (2011). Multiple responses, promising results: Evidence-based, nonpunitive alternatives to zero tolerance. Child Trends, Publication #2011-09, 1-12.

Bocchi, B., Dozza, L., Chianese, G., & Cavrini, G. (2014). School climate: Comparison between parents' and teachers' perception. Procedia--Social and Behavioral Sciences, 116, 4643-4649. doi: 10.1016/j.sbspro.2014.01.1000

Brigman, G., Mullis, F., Webb, L., & White, J. (2008). School counselor consultation: Skills for working effectively with parents, teachers, and other school personnel (2nd ed.). Hoboken, NJ: John Wiley.

Brophy, J. E. (2017). Fostering student learning and motivation in the elementary school classroom. In S. G. Paris, G. M. Olson, & H. W. Stevenson (Eds.), Learning and motivation in the classroom (pp. 283-306). New York, NY: Routledge.

Brown, T. A. (2015). Confirmatory factor analysis for applied research (2nd ed.). New York, NY: Guilford Press.

Byrne, B. M. (2016). Structural equation modeling with Amos (3rd ed.). New York, NY: Routledge.

Carey, J., & Dimmitt, C. (2008). A model for evidence-based elementary school counseling: Using school data, research, and evaluation to enhance practice. The Elementary School Journal, 108, 422-430. doi: 10.1086/589471

Cavrini, G., Chianese, G., Bocch, B., & Dozza, L. (2015). School climate: Parents', students' and teachers' perceptions. Procedia Social and Behavioral Sciences, 191, 2044-2048. doi: 10.1016/j.sbspro.2015.04.641

Centers for Disease Control and Prevention. (2009). School connectedness: Strategies for increasing protective factors among youth. Retrieved from http://www.cdc.gov/HealthyYouth/AdolescentHealth/pdf/connectedness.pdf

Cleveland, R., & Sink, C. A. (2018). Student happiness, school climate, and school improvement plans: Implications for school counseling practice. Professional School Counseling, 21, 1-10. doi: 10.1177/2156759X18761898

Coleman, J. S. (2018). Parents, their children, and schools. New York, NY: Routledge.

Dahir, C. A., & Stone, C. B. (2003). Accountability: A M.E.A.S.U.R.E of the impact school counselors have on student achievement. Professional School Counseling, 6, 214-221. Retrieved from https://www.jstor.org/stable/42732431

Dimitrov, D. M. (2012). Statistical methods for validation of assessment scale data in counseling and related fields. Alexandria, VA: American Counseling Association.

Dimmitt, C. (2009). Why evaluation matters: Determining effective school counseling practices. Professional School Counseling, 12, 395-399. doi: 10.1177/2156759X0901200605

Dimmitt, C., Carey, J. C., & Hatch, T. (2007). Evidence-based school counseling: Making a difference with data-driven practices. Thousand Oaks, CA: Corwin.

Erford, B. (2015). Research and evaluation in counseling (2nd ed.). Stamford, CT: Cengage Learning.

Evans, I. M. (2014). School and classroom climate. In D. C. Philips (Ed.), Encyclopedia of educational theory and philosophy (Vol. 1, pp. 736-737). Thousand Oaks, CA: Sage.

Field, A. (2018). Discovering statistics using IBM SPSS statistics. Thousand Oaks, CA: Sage.

Fisher, D. L., & Fraser, B. J. (1981). Validity and use of my class inventory. Science Education, 65, 145-156.

Fraser, B. J. (2012). Classroom learning environments: Retrospect, context and prospect. In B. J. Fraser, K. Tobin, & C. J. McRobbie (Eds.), Second international handbook of science education (pp. 1191-1239). New York, NY: Springer. doi: 10.1007/978-1-4020-9041-7_79

Fraser, B. J., Anderson, D. L., & Walberg, H. J. (1982). Assessment of learning environments: Manual for Learning Environment Inventory (LEI) and My Class Inventory (MCI; 3rd version). Retrieved from https://files.eric.ed.gov/fulltext/ED223649.pdf

Fraser, B. J., & O'Brien, P. (1985). Student and teacher perceptions of the environment of elementary school classrooms. The Elementary School Journal, 85, 567-580. doi: 10.1086/461422

Furrer, C. J., Skinner, E. A., & Pitzer, J. R. (2014). The influence of teacher and peer relationships on students' classroom engagement and everyday motivational resilience. National Society for the Study of Education, 113, 101-123. Retrieved from https://www.pdx.edu/psy/sites/www.pdx.edu.psy/files/2014-Furrer.Skinner.Pitzer%20%281%29.pdf

Gottfredson, G. D., Gottfredson, D. C., Payne, A. A., & Gottfredson, N. C. (2004). School climate predictors of school disorder: Results from a national study of delinquency prevention in schools. Journal of Research in Crime and Delinquency, 42, 412-444. doi: 10.1177/0022427804271931

Hallinger, P., Bickman, L., & Davis, K. (1996). School context, principal leadership, and student reading achievement. The Elementary School Journal, 96, 527-549. Retrieved from https://www.jstor.org/stable/1001848

Kaffenberger, C., & Young, A. (2013). Making data work (3rd ed.). Alexandria, VA: American School Counselor Association.

Kline, R. (2016). Principles and practices of structural equation modeling (4th ed.). New York, NY: Guilford Press.

Koth, C. W., Bradshaw, C. P., & Leaf, P. J. (2009). Teacher observation of classroom adaptation--checklist: Development and factor structure. Measurement and Evaluation in Counseling and Development, 42, 15-30. doi: 10.1177/0748175609333560

La Paro, K. M., Pianta, R. C., & Stuhlman, M. (2004). The classroom assessment scoring system: Findings from the pre-kindergarten year. Elementary School Journal, 104, 409-426. doi: 10.1086/499760

Lee, J., & Shute, V. J. (2010). Personal and social-contextual factors in K-12 academic performance: An integrative perspective on student learning. Educational Psychologist, 45, 185-202. doi: 10.1080/00461520.2010.493471

Leff, S. S., Thomas, D. E., Shapiro, E. S., Paskewich, B., Wilson, K., Necowitz-Hoffman, B., & Jawad, A. F. (2011). Developing and validating a new classroom climate observation assessment tool. Journal of School Violence, 10, 165-184. doi: 10.1080/15388220.2010.539167

Maras, M. A., Coleman, S. L., Gysbers, N. C., Herman, K. C., & Stanley, B. (2013). Measuring evaluation competency among school counselors. Counseling Outcome Research and Evaluation, 4, 99-111. doi: 10.1177/2150137813494765

Mariani, M., Villares, E., Sink, C. A., Colvin, K., & Perhay Kuba, S. (2015). Confirming the structural validity of the my class inventory-short form revised. Professional School Counseling, 19, 92-102. doi: 10.5330/1096-2409-19.1.92

Mariani, M., Webb, L., Villares, E., & Brigman, G. (2015). Effect of student success skills on prosocial and bullying behavior. The Professional Counselor, 5, 341-353. doi: 10.15241/mm.5.3.341

Marsh, H. W., Hau, K. T., & Wen, Z. (2004). In search of golden rules: Comment on hypothesis testing approaches to setting cutoff values for fit indexes and dangers in overgeneralizing Hu and Bentler's (1999) findings. Structural Equation Modeling, 11, 320-341. doi: 10.1207/s15328007sem1103_2

McIntosh, K., & Goodman, S. (2016). Integrated multitiered systems of support: Blending RTI and PBIS. New York, NY: Guilford Press.

McMahon, H. G., Mason, E. C. M., Daluga-Guenther, N., & Ruiz, A. (2013). An ecological model of professional school counseling. Journal of Counseling & Development, 92, 459-471. doi: 10.1002/j.1556-6676.2014.00172.x

Melnick, H., Cook-Harvey, C. M., & Darling-Hammond, L. (2017). Encouraging social and emotional learning in the context of new accountability. Palo Alto, CA: Learning Policy Institute. Retrieved from http://learningpolicyinstitute.org/product/sel-newaccountability

Mitchell, M. M., Bradshaw, C. P., & Leaf, P. J. (2010). Student and teacher perceptions of school climate: A multilevel exploration of patterns of discrepancy. Journal of School Health, 80, 271-279. doi: 10.1111/j.1746-1561.2010.00501.x

Mvududu, N., & Sink, C. A. (2013). Factor analysis in counseling research. Counseling Outcome Research and Evaluation, 4, 75-98. doi: 10.1177/2150137813494766

National Center on Safe and Supportive School Environments. (2018). School climate measurement. Washington, DC: American Institutes for Research. Retrieved from https://safesupportivelearning.ed.gov/topic-research/school-climate-measurement

Nunnally, J. C., & Bernstein, I. H. (1994). The assessment of reliability. Psychometric Theory, 3, 248-292.

Pellegrini, A. D., & Bartini, M. (2000). An empirical comparison of methods of sampling aggression and victimization in school settings. Journal of Educational Psychology, 92, 360-366. doi: 10.1037/0022-0663.92.2.360

Ponterotto, J. G., & Ruckdeschel, D. E. (2007). An overview of coefficient alpha and a reliability matrix for estimating adequacy of internal consistency coefficients with psychological research measures. Perceptual and Motor Skills, 105, 997-1014.

Poynton, T. A., & Carey, J. C. (2006). An integrative model of data-based decision making for school counseling. Professional School Counseling, 10, 121-130. doi: 10.5330/1096-240920.1.54

Ramsey, C. M., Spira, A. O., Parisi, J. M., & Rebok, G. W. (2016). School climate: Perceptual differences between students, parents, and school staff. School Effectiveness and School Improvement, 27, 629-641. doi: 10.1080/09243453.2016.1199436

Reyes, M. R., Brackett, M. A., Rivers, S. E., White, M., & Salovey, P. (2012). Classroom emotional climate, student engagement, and academic achievement. Journal of Educational Psychology, 104, 700-712. doi: 10.1037/a0027268

Schueler, B. E., Capotosto, L., Bahena, S., McIntyre, J., & Gehlbach, H. (2014). Measuring parent perceptions of school climate. Psychological Assessment, 26, 314-320. doi: 10.1037/a0034830

Schumacker, R. E., & Lomax, R. G. (2016). A beginner's guide to structural equation modeling (4th ed.). New York, NY: Routledge Academic.

Sink, C. A. (2008). Elementary school counselors and teachers: Collaborators for higher student achievement. The Elementary School Journal, 108, 445-458. doi: 10.1086/589473

Sink, C. A. (2009). School counselors as accountability leaders: Another call for action. Professional School Counseling, 13, 68-74. doi: 10.1177/2156759X0901300202

Sink, C. A. (2017). My Child's Classroom Inventory-Short Form for Parents (unpublished instrument). Norfolk, VA: Old Dominion University.

Sink, C. A. (2019). Tier 1: Creating strong universal systems of support and facilitating systemic change. In E. Goodman-Scott, J. Betters-Bubon, & P. Donahue (Eds.), The school counselor's guide to multi-tiered systems of support (pp. 62-98). New York, NY: Routledge.

Sink, C. A., & Spencer, L. R. (2005). My class inventory-short form as an accountability tool for elementary school counselors to measure classroom climate. Professional School Counseling, 9, 37-48. doi: 10.1177/2156759X0701100208

Sink, C. A., & Spencer, L. R. (2007). Teacher version of the my class inventory-short form: An accountability tool for elementary school counselors. Professional School Counseling, 11, 129-139. doi: 10.1177/2156759X0701100208

Streiner, D. L. (2003). Starting at the beginning: An introduction to coefficient alpha and internal consistency. Journal of Personality Assessment, 80, 99-103.

Tabachnick, B. G., & Fidell, L. S. (2013). Using multivariate statistics (6th ed.). Boston, MA: Pearson.

Tavakol, M., & Dennick, R. (2011). Making sense of Cronbach's alpha. International Journal of Medical Education, 2, 53-55. doi: 10.5116/ijme.4dfb.8dfd

Thapa, A., Cohen, J., Guffey, S., & Higgins-D'Alessandro, A. (2013). A review of school climate research. Review of Educational Research, 83, 357-385. doi: 10.3102/0034654313483907

Thomas, D. E., Bierman, K. L., & Powers, C. J. (2011). The influence of classroom aggression and classroom climate on aggressive-disruptive behavior. Child Development, 82, 751-757. doi: 10.1111/j.1467-8624.2011.01586.x

U.S. Department of Education. (2007). Mobilizing for evidence-based character education. Retrieved from http://www2.ed.gov/programs/charactered/mobilizing.pdf

U.S. Department of Education. (2018). Every Student Succeeds Act (ESSA). Retrieved from https://www.ed.gov/essa?src=rn

Van Schoiack-Edstrom, L., Frey, K. S., & Beland, K. (2002). Changing adolescents' attitudes about relational and physical aggression: An early evaluation of school-based intervention. School Psychology Review, 31, 201-216.

Villares, E., Mariani, M., Sink, C. A., & Colvin, K. (2016). Multilevel confirmatory factor analysis of the Teacher My Class Inventory--Short Form. Measurement and Evaluation in Counseling and Development, 49, 263-273. doi: 10.1177/0748175616639107

Warren, J. M., & Gerler, E. R. (2013). Effects of cognitive behavioral consultation on irrational and efficacy beliefs of elementary school teachers. The Professional Counselor, 3, 6-15. Retrieved from http://tpcjournal.nbcc.org/wp-content/uploads/2013/06/tpc-vol-3-iss-1/

Weston, R., & Gore, P. A., Jr. (2006). A brief guide to structural equation modeling. The Counseling Psychologist, 34, 719-751. doi: 10.1177/0011000006286345

Young, A., & Kaffenberger, C. (2011). The beliefs and practices of school counselors who use data to implement comprehensive school counseling programs. Professional School Counseling, 15, 67-76. doi: 10.1177/2156759X1101500204

Zyromski, B., & Mariani, M. (2016). Facilitating evidence-based, data-driven school counseling: A manual for practice. Thousand Oaks, CA: Corwin.

Author Biographies

Melissa Mariani, PhD, is an associate professor of school counseling in the Department of Counselor Education at Florida Atlantic University in Boca Raton, FL.

Christopher A. Sink, PhD, NCC, is a professor of counseling and human services at Old Dominion University in Norfolk, VA.

Elizabeth Villares, PhD, is a professor of school counseling in the Department of Counselor Education at Florida Atlantic University, FL.

Carolyn Berger, PhD, is an assistant professor in the Department of Educational Psychology at the University of Minnesota in Minneapolis.

Melissa Mariani [1], Christopher A. Sink [2], Elizabeth Villares [1], and Carolyn Berger [3]

[1] Department of Counselor Education, Florida Atlantic University, Boca Raton, FL, USA

[2] Old Dominion University, Norfolk, VA, USA

[3] Department of Educational Psychology, University of Minnesota, Minneapolis, MN, USA

Corresponding Author:

Melissa A. Mariani, PhD, Department of Counselor Education, Florida Atlantic University, 777 Glades Road, Boca Raton, FL 33431, USA.

Email: marianimelissa@yahoo.com

Figure 1. Confirmatory factor analysis model with Minnesota respondents.

Table 1. Descriptive Statistics for 25-Item Original Questionnaire.

Item                                                                                              M     SD    Skew   Kurtosis

 1. My child benefits from the classroom learning activities. (a) (S)                           4.50   0.75   -2.22     7.01
 2. My child rarely has conflicts with classmates. (F)                                          4.14   0.97   -1.32     1.51
 3. My child often competes with classmates to see who can finish their assignments first. (C)  3.17   0.91   -0.04     0.02
 4. My child sees the classroom work as challenging to finish. (D)                              3.12   1.07   -0.15    -1.02
 5. My child is good friends with most classmates. (Co)                                         4.14   0.79   -1.09     1.76
 6. My child is happy with his or her classroom experience. (S)                                 4.43   0.72   -1.55     3.53
 7. My child says some classmates are mean. (F)                                                 3.24   1.18   -0.19    -1.22
 8. My child wants to do better on his/her assignments than classmates. (C)                     2.61   1.02    0.15    -0.87
 9. My child can easily complete most of the class assignments. (D)                             3.96   0.91   -0.98     0.69
10. My child likes her/his classmates. (Co)                                                     4.36   0.58   -0.55     0.93
11. My child values the learning experience in the classroom. (S)                               4.32   0.71   -1.04     1.51
12. My child says several classmates cause tension. (F)                                         3.50   1.11   -0.62    -0.54
13. My child feels bad when classmates get better grades. (C)                                   3.51   0.97   -0.47    -0.39
14. My child thinks only the smartest children can do all the classroom work. (D)               3.91   0.88   -0.92     1.02
15. My child has positive relationships with classmates. (Co)                                   4.36   0.62   -0.91     2.59
16. My child has positive feelings about his/her classroom learning experience. (S)             4.34   0.78   -1.56     3.44
17. My child feels many classmates often want to get their own way. (F)                         3.31   0.89   -0.17    -0.12
18. My child thinks some classmates try to get better grades than others. (C)                   3.08   0.89    0.14    -0.32
19. My child reports that the classroom work is not too complicated. (D)                        3.12   1.10   -0.17    -1.14
20. My child gets along well with classmates. (Co)                                              4.36   0.63   -0.86     1.80
21. My child enjoys the classroom activities. (S)                                               4.41   0.69   -1.38     3.15
22. My child rarely quarrels with classmates. (F)                                               4.12   0.90   -1.26     1.82
23. My child feels that many classmates compete for top scores on assignments. (C)              3.32   0.88   -0.15    -0.06
24. My child finds the classroom work difficult. (D)                                            3.66   0.95   -0.75    -0.13
25. My child cares a lot for his/her classmates. (Co)                                           4.30   0.63   -0.77     1.70

Note. N = 669. Item ratings ranged from 1 = strongly disagree to 5 = strongly agree. SE skew = .09; SE kurtosis = .19.
S = satisfaction; F = friction; Co = cohesion; C = competitiveness; D = difficulty.

(a) Item deleted; negatively worded items were recoded in a positive direction.
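
School counselors who wish to generate item-level descriptives like those in Table 1 from their own parent survey exports can do so in a few lines of Python. The sketch below is illustrative only: the file name mcci_sf_parent_responses.csv and the item_1 through item_25 column names are hypothetical placeholders, and negatively worded items are assumed to have already been recoded in a positive direction.

```python
# Illustrative sketch (not the authors' code): item-level M, SD, skew, and
# excess kurtosis for a 25-item, 5-point parent survey. The file and column
# names below are hypothetical placeholders.
import pandas as pd

responses = pd.read_csv("mcci_sf_parent_responses.csv")      # one row per parent
items = responses[[f"item_{i}" for i in range(1, 26)]]       # item_1 ... item_25

descriptives = pd.DataFrame({
    "M": items.mean(),
    "SD": items.std(),         # sample standard deviation (ddof = 1)
    "Skew": items.skew(),      # bias-corrected skewness
    "Kurtosis": items.kurt(),  # excess kurtosis, the metric reported in Table 1
}).round(2)

print(descriptives)
```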

Table 2. Item Intercorrelations for Entire Sample.

Item    2     3     4      5        6        7         8

2       --   .00   .02   .34 **   .35 **   .28 **   -.05
3            --    .04   .04      .02      .05       .40 **
4                  --    .04      .07      .06       .04
5                        --       .44 **   .26 **   -.04
6                                 --       .28 **   -.02
7                                          --        .08 *
8                                                   --

Item       9          10         11        12       13       14

2        .09 *      .32 **     .26 **    .29 **   .07      .05
3       -.10 **     .03       -.02       .03      .23 **   .12 **
4        .36 **     .04        .12 **    .04      .14 **   .19 **
5        .17 **     .57 **     .37 **    .22 **   .09 *    .13 **
6        .20 **     .46 **     .65 **    .31 **   .14 **   .20 **
7        .09 *      .39 **     .19 **    .60 **   .16 **   .17 **
8       -.13 **    -.01       -.08 *     .05      .33 **   .09 *
9       --          .17 **     .30 **    .07      .21 **   .33 **
10                 --          .43 **    .40 **   .11 **   .21 **
11                            --         .15 **   .14 **   .28 **
12                                       --       .18 **   .21 **
13                                                --       .44 **
14                                                         --

Item      15        16        17       18        19         20

2       .32 **    .27 **    .20 **   .03       .00        .37 **
3       .05      -.05       .08 *    .17 **   -.04        .03
4       .07       .08 *     .01      .05       .22 **     .10 *
5       .60 **    .35 **    .21 **   .04       .03        .63 **
6       .47 **    .70 **    .27 **   .11 **    .04        .47 **
7       .31 **    .22 **    .42 **   .22 **    .02        .31 **
8       .01      -.04       .12 **   .26 **   -.07       -.04
9       .18 **    .27 **    .05      .07       .30 **     .23 **
10      .59 **    .42 **    .32 **   .11 **    .08 *      .61 **
11      .43 **    .69 **    .21 **   .09 *     .10 *      .44 **
12      .33 **    .24 **    .46 **   .27 **   -.01        .36 **
13      .19 **    .17 **    .24 **   .30 **    .07        .16 **
14      .27 **    .29 **    .26 **   .23 **    .12 **     .24 **
15      --        .49 **    .28 **   .12 **    .09 *      .69 **
16               --         .26 **   .10 **    .10 *      .44 **
17                          --       .46 **   -.12 **     .25 **
18                                   --       -.10 **     .08
19                                            --          .08 *
20                                                       --

Item       21        22        23         24        25

2        .26 **    .53 **    .03        .04       .29 **
3        .00       .04       .31 **    -.08       .01
4        .07       .04       .05        .48 **    .05
5        .40 **    .35 **   -.02        .10 **    .50 **
6        .71 **    .29 **    .10 *      .17 **    .40 **
7        .22 **    .21 **    .18 **     .11 **    .17 **
8       -.09 *     .09 *     .38 **    -.06       .00
9        .25 **    .12 **    .09 *      .59 **    .18 **
10       .45 **    .33 **    .10 *      .14 **    .48 **
11       .71 **    .26 **    .08 *      .22 **    .44 **
12       .26 **    .27 **    .24 **     .13 **    .21 **
13       .16 **    .07       .40 **     .23 **    .09 *
14       .28 **    .12 **    .33 **     .39 **    .14 **
15       .49 **    .35 **    .14 **     .16 **    .51 **
16       .72 **    .27 **    .15 **     .24 **    .40 **
17       .21 **    .16 **    .35 **     .12 **    .18 **
18       .12 **    .06       .58 **     .11 **    .14 **
19       .11 **    .05      -.07        .28 **    .03
20       .53 **    .44 **    .06        .16 **    .57 **
21      --         .33 **    .08        .20 **    .50 **
22                 --        .02        .06       .35 **
23                          --          .18 **    .09 *
24                                     --         .11 **
25                                                --

Note. N = 669. * p < .05. ** p < .01.
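
A matrix like Table 2 can be reproduced from raw responses with pairwise Pearson correlations and the usual significance flags. The sketch below is a minimal illustration, assuming complete data and reusing the same hypothetical file and column names as in the previous sketch.

```python
# Illustrative sketch: pairwise item intercorrelations with p-value flags,
# mirroring the reporting convention in Table 2 (* p < .05, ** p < .01).
# Assumes complete data; file and column names are hypothetical.
from itertools import combinations

import pandas as pd
from scipy.stats import pearsonr

responses = pd.read_csv("mcci_sf_parent_responses.csv")
item_cols = [f"item_{i}" for i in range(2, 26)]   # items 2-25, as in Table 2

rows = []
for a, b in combinations(item_cols, 2):
    r, p = pearsonr(responses[a], responses[b])
    flag = "**" if p < .01 else "*" if p < .05 else ""
    rows.append({"item_a": a, "item_b": b, "r": round(r, 2), "sig": flag})

print(pd.DataFrame(rows))
```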

Table 3. Rotated Factor Pattern Matrix for Florida Data Set.

                                                                                             Factor 1         Factor 2          Factor 3     Factor 4
Items                                                                                        Peer Relations   Competitiveness   Difficulty   Satisfaction

20. My child gets along well with classmates.                                                    .86
 5. My child is good friends with most classmates.                                               .81
15. My child has positive relationships with classmates.                                         .76
10. My child likes her/his classmates.                                                           .64              .10                           -.10
25. My child cares a lot for his/her classmates.                                                 .64
22. My child rarely quarrels with classmates.                                                    .38
 2. My child rarely has conflicts with classmates.                                               .33                                            -.12
23. My child feels that many classmates compete for top scores on assignments.                                    .84
18. My child thinks some classmates try to get better grades than others.                                         .66
17. My child feels many classmates often want to get their own way.                                               .52
 8. My child wants to do better on his/her assignments than classmates.                          .17              .48                           -.16
13. My child feels bad when classmates get better grades.                                                         .44               .25
 3. My child often competes with classmates to see who can finish their assignments first.                        .34
24. My child finds the classroom work difficult.                                                                  .10               .80
 9. My child can easily complete most of the class assignments.                                                                     .73
 4. My child sees the classroom work as challenging to finish.                                                                      .56
19. My child reports that the classroom work is not too complicated.                                             -.16               .39
16. My child has positive feelings about his/her classroom learning experience.                                                                 -.85
 6. My child is happy with his/her classroom experience.                                                                                        -.80
11. My child values the learning experience in the classroom.                                                                       .12         -.76
21. My child enjoys the classroom activities.                                                    .15                                            -.70

Prerotation eigenvalue (% of variance)                                                       6.37 (26.53)     2.74 (11.40)      2.12 (8.98)  1.46 (6.07)
Cronbach's alpha (no. of items)                                                               .81 (7)          .72 (6)           .71 (4)      .88 (4)

Note. n = 318. Blank cells = factor loading <.10; factor intercorrelations ranged from .08 (Factor 2 with 3) to -.62 (Factor 1 with 4).
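
Finally, analysts who want to explore a structure like the one summarized in Table 3 in their own data could fit an exploratory factor analysis with an oblique rotation and then compute coefficient alpha for each retained subscale. The sketch below is one possible workflow using the open-source factor_analyzer package; it is not the authors' analysis. The extraction and rotation settings shown here are assumptions, the file and column names are hypothetical, and the alpha example uses only the Peer Relations items listed under Factor 1 in Table 3.

```python
# Illustrative sketch: four-factor EFA with an oblique (oblimin) rotation and
# coefficient alpha for one subscale. Settings, file name, and column names
# are assumptions for illustration, not the authors' exact analysis.
import pandas as pd
from factor_analyzer import FactorAnalyzer

responses = pd.read_csv("mcci_sf_parent_responses.csv")
item_cols = [c for c in responses.columns if c.startswith("item_")]

fa = FactorAnalyzer(n_factors=4, rotation="oblimin", method="principal")
fa.fit(responses[item_cols])

loadings = pd.DataFrame(fa.loadings_, index=item_cols)   # rotated pattern matrix
eigenvalues, _ = fa.get_eigenvalues()                    # prerotation eigenvalues
print(loadings.round(2))
print(eigenvalues[:4].round(2))

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Coefficient alpha: k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    return (k / (k - 1)) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))

# Peer Relations subscale, following the item assignment shown in Table 3 (Factor 1).
peer_items = ["item_2", "item_5", "item_10", "item_15", "item_20", "item_22", "item_25"]
print(round(cronbach_alpha(responses[peer_items]), 2))
```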
