Predictive modeling of student performances for retention and academic support in a diagnostic medical sonography program.

As part of a retention and academic support program, data were collected to develop a predictive model of student performances in core classes in a Diagnostic Medical Sonography (DMS) program. The research goal was to identify students likely to have difficulty with coursework and provide supplemental tutorial support. The focus was on the prediction of student performances in Acoustical Physics and Instrumentation, a class identified as problematic for students. Students' performances on the Measure of the Ability to Form Spatial Mental Imagery (MASMI: Campos, 2009), scores on the Scholastic Level Exam (SLE: Wonderlic, Inc., 2001), and archival classroom performances were used to construct the predictive model using multiple regression. Results show a robust predictive model can be developed for the targeted course and suggest such a methodology might be extended to other outcomes of students' success.

**********

Student retention and success, operationalized as the timely graduation from academic programs, has received greater attention than ever before and is increasingly becoming a measure of institutional effectiveness (Blose, 1999; Chacon, Spicer, & Valbuena, 2012; Ewell & Jones, 2006). In institutions of higher education, such attention has led to a direct linking of this metric to funding of institutional budgets and spending (Jung Cheol, 2010; Ewell & Jones, 2006; Rabovsky, 2012). This has resulted in an increased interest in identifying and supporting students at risk for failure; however, there continues to be a need for empirical and contextual specificity, i.e., who needs the support and when is it best to intervene (Project on Academic Success, 2009).

Parallel to the development of increased accountability in higher education has been the emergence of the field of academic analytics. Academic analytics can be defined as a hypothesis-driven, action research approach used to solve specific, practical academic problems (Baepler & Murdoch, 2010). In application, the researcher uses institutional data to build predictive models of academic outcomes. Data generated from these models are then applied to the area or problem of interest. One such problem, and the framework for the current research, is the prediction of student program success and retention until graduation.

Although indicators such as high school grade point averages and entrance exam scores have historically been used to predict student success and retention, such variables have been identified as inadequate and not specifically relevant to individual institutions (Tinto, 2007). In a review of retention research, Tinto suggests that retention/persistence variables differ across institutional settings. For interventions to be effective, they must address local, programmatic needs. Often, individual campuses and programs have specific classes for which student successes and failures can function as finer grained sentinel events.

Identifying students' academic needs for remediation and providing targeted assistance to those who need it, with the long-term goal of timely graduation, are the purposes of the current project. As part of a retention and academic support program, it is important to be able to intervene and scaffold students proactively. Identification of students likely to have difficulty in core program classes is an important prerequisite to such intervention. As timeliness is necessary for efficacy, the earlier such identification can be accomplished, the better.

Therefore, identification of specific predictive indicators is needed. Core competencies in sonography are associated with spatial ability (Clem, Anderson, Donaldson, & Hdeib, 2010). In addition, measures of general ability and classroom performances are likely indicators of successful progress through the program and future success in the profession. Accordingly, three possible predictive indicators were identified: scores on the Measure of the Ability to Form Spatial Mental Imagery (MASMI: Campos, 2009), a measure of spatial ability; scores on the Wonderlic Scholastic Level Exam (SLE: Wonderlic, Inc., 2001), a measure of general ability; and grades in the General Physics I class (PHY2001; Keiser University, 2012), a general education requirement for the program.

As the Ft. Myers campus of Keiser University is a new campus, only a small number of students in the Diagnostic Medical Sonography (DMS) program have completed their course of studies at this time. Consequently, data collection is ongoing and preliminary, with the current focus on the prediction of student performances in Acoustical Physics and Instrumentation (SON 1614; Keiser University, 2012). This class has been identified by program personnel as one historically problematic for students.

Materials and Methods

With approval from the Institutional Review Board, informed consent was obtained from all participants. Participants were 36 individuals comprising the first five cohorts of DMS students in attendance at Keiser University, Ft. Myers campus. All members of the cohorts agreed to participate in the project. There were 33 females and three males, and ages ranged from 20 to 49. Participants were informed that the purpose of the research was to help determine whether predictive indicators for program success could be identified and that their participation was entirely voluntary. No compensation was provided for participation, but students were encouraged to do their best if they chose to participate.

Letter grades for PHY2001 and scores on the SLE for all participants were collected from student records. Letter grades were converted to a 4.0 scale for analysis. In cases in which the class and/or the SLE had been taken more than once, only the first attempt was recorded. The resulting scores were combined in a digital data file with student demographic data that included final grades from the PHY2001 class, scores on the SLE, and the participants' scores on the MASMI. These performances were then used to construct a predictive model for academic success in SON 1614 using simultaneous multiple regression.
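To make the data-assembly step concrete, the sketch below shows one way the grade conversion and first-attempt records could be represented. The grade scale (a straight A-F mapping) and the field names are assumptions for illustration, not details reported in the study.

```python
# Minimal sketch of the data-assembly step, assuming a straight A-F scale
# (the study does not report whether plus/minus grades were used).
GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

def to_gpa(letter_grade: str) -> float:
    """Convert a PHY2001 letter grade to the 4.0 scale used in the analysis."""
    return GRADE_POINTS[letter_grade.strip().upper()]

# Hypothetical record layout: one row per participant, first attempts only.
record = {"PHY2001": to_gpa("B"), "SLE": 24, "MASMI": 16, "SON1614": 80}
```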

The MASMI was administered to each cohort group as part of a core orientation seminar, and the protocols were scored by hand by the primary author. The MASMI is a timed, 23-item measure of spatial mental imagery that consists of two-dimensional pictures of unfolded cubes. It can be administered individually or in groups with paper and pencil. The respondent is required to mentally assemble (fold) each cube. Each question offers four possible responses, two correct and two incorrect. The score is determined by totaling all correct answers and then subtracting the number of incorrect responses. Test completion is limited to a maximum of ten minutes. The author reports a Cronbach's alpha of .93 with a sample of 138 college undergraduate students averaging 20 years of age.
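The raw-score rule described above is simple arithmetic; a small illustrative function follows. The example counts are hypothetical and are not taken from any participant's protocol.

```python
def score_masmi(correct_selections: int, incorrect_selections: int) -> int:
    """MASMI raw score as described above: correct selections minus
    incorrect selections across the 23 unfolded-cube items."""
    return correct_selections - incorrect_selections

# Hypothetical protocol: 30 correct and 14 incorrect selections.
print(score_masmi(30, 14))  # 16, near the sample mean reported in Table 1
```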

The SLE is a timed, computer-administered measure of general ability taken as part of student admission to the University. It consists of 50 multiple-choice and short-answer items, and the score is determined by simply adding the number of correct responses completed within the 12-minute time limit. Test-retest reliabilities of .82-.94 and alternate-forms reliabilities of .73-.95 are reported in the manual (as cited in Geisinger, 2001). In addition, Geisinger (2001) and Schraw (2001) document validity evidence with individually administered ability tests.

Results

A simultaneous multiple regression analysis was conducted to determine the best linear combination of grades in PHY2001, performances on the MASMI, and performances on the SLE for predicting student performances in SON 1614. The means and standard deviations can be found in Table 1. Figure 1 shows that the independent variables are generally linear in relation to the dependent variable. Figure 2 shows the residual scatterplot, which demonstrates that the errors are normally distributed, the variances of the residuals are constant, and the residuals are relatively uncorrelated with the linear combination of predictors.
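For readers who wish to reproduce this kind of analysis, a minimal sketch using Python's statsmodels is shown below. The file name and column names are assumptions; the original analysis was presumably run in a standard statistics package, and this sketch only illustrates the simultaneous (forced-entry) regression and the residual checks described above.

```python
# Sketch of a simultaneous multiple regression, assuming the 36 participant
# records have been saved to a CSV file with these hypothetical column names.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("dms_cohorts.csv")                    # hypothetical file name
X = sm.add_constant(df[["PHY2001", "SLE", "MASMI"]])   # all predictors entered at once
y = df["SON1614"]

model = sm.OLS(y, X).fit()
print(model.summary())   # F test, unstandardized B weights, R-squared, adjusted R-squared

# Residual diagnostics corresponding to Figure 2 (residuals vs. predicted values).
residuals = model.resid
predicted = model.fittedvalues
```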

This combination of variables significantly predicted student performances in SON 1614, F(3, 32) = 24.84, p < .01, with each of the variables (grades in PHY2001, the MASMI raw score, and performance on the SLE) contributing significantly to success in SON 1614. The beta weights, presented in Table 2, suggest that good grades in PHY2001 contribute most to success, while higher scores on the SLE and the MASMI contribute to the prediction to a lesser degree, respectively. The adjusted R-squared value was .67, indicating that 67% of the variance in success in SON 1614 was explained by this model.
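Written out with the unstandardized coefficients in Table 2, the model yields a simple prediction equation. The sketch below evaluates it for a hypothetical student; the input values are illustrative only and do not describe any participant.

```python
def predict_son1614(phy2001_gpa: float, sle: float, masmi: float) -> float:
    """Predicted SON 1614 grade from the Table 2 unstandardized coefficients:
    constant 51.45, PHY2001 2.99, SLE 0.76, MASMI 0.10."""
    return 51.45 + 2.99 * phy2001_gpa + 0.76 * sle + 0.10 * masmi

# Hypothetical student: B (3.0) in PHY2001, SLE score of 24, MASMI raw score of 16.
print(round(predict_son1614(3.0, 24, 16), 1))  # approximately 80.3
```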

Discussion

The present research supports the feasibility of using existing data to create models of student success for prediction and, when necessary, remediation. Specifically, the current results support the use of predictive modeling for the identification of students at risk for failure in SON 1614. Such modeling can be extremely useful in apportioning limited university resources. Identification of such localized and timely predictors provides opportunities for targeted scaffolding. Accordingly, the initiation of targeted remedial instruction and activities based on such a model appears justified.

The study, using the identified variables, provides an additional way to detect the specific students in a specific program who are likely to experience difficulties, before they are in distress. Remediation can be proactive instead of reactive. As the cohorts progress through the program, the model can be fine-tuned for additional outcomes, and ultimately a localized equation can be developed for each program milestone, up to and including graduation and professional employment.
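In practice, proactive identification could be as simple as flagging students whose predicted SON 1614 grade falls below a program-chosen cutoff. The sketch below reuses the hypothetical predict_son1614 function from the Results section; the 75.0 cutoff is an assumption for illustration, not a criterion reported in the study.

```python
# Hypothetical at-risk screen built on the prediction equation above.
AT_RISK_CUTOFF = 75.0  # illustrative threshold; the program would set its own

def needs_support(phy2001_gpa: float, sle: float, masmi: float) -> bool:
    """Flag a student for supplemental tutorial support before SON 1614 begins."""
    return predict_son1614(phy2001_gpa, sle, masmi) < AT_RISK_CUTOFF

print(needs_support(2.0, 20, 5))   # True: offer targeted scaffolding early
```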

The study's greatest limitation is one that is evident in most action research: generalization (Mills, 2000). Such generalization is strictly limited by the goal of the current research, which was to provide a predictive model for a specific program. The methodology, however, is generalizable and, in replication, could provide an accretion of evidence for model validation.

An additional limitation concerns gender. Due to the limited number of male participants, this variable could not be examined. Since one of the key aptitudes suggested for successful performance in the profession is spatial ability (Clem, Anderson, Donaldson, & Hdeib, 2010), and performance variances have been associated with gender (Geiser, Lehmann, & Eid, 2008), this may be one of the reasons the MASMI's predictive effect was not as robust as expected. As the sonography program develops, a more diverse sample will be available to examine this variable, further validate the model, and increase predictive reliability.

References

Baepler, P., & Murdoch, C. (2010). Academic analytics and data mining in higher education. International Journal for the Scholarship of Teaching & Learning, 4(2), 1-9.

Blose, G. (1999). Modeled retention and graduation rates: Calculating expected retention and graduation rates for multicampus university systems. New Directions for Higher Education, 69-86. doi:10.1002/he.10805

Campos, A. (2009). Spatial imagery: A new measure of the visualization factor. Imagination, Cognition and Personality, 29, 31-39. doi:10.2190/IC.29.1.c.

Clem, D., Anderson, S., Donaldson, J., & Hdeib, M. (2010). An exploratory study of spatial ability and student achievement in sonography. Journal of Diagnostic Medical Sonography, 26(4), 163-170.

Chacon, F., Spicer, D., & Valbuena, A. (2012). Analytics in support of student retention and success (Research Bulletin 3). Louisville, CO: EDUCAUSE Center for Applied Research. Retrieved from http://net.educause.edu/ir/library/pdf/ERB1203.pdf

Ewell, P. T., & Jones, D. P. (2006). State-level accountability for higher education: On the edge of a transformation. New Directions for Higher Education, (135), 9-16. doi:10.1002/he.222

Geiser, C., Lehmann, W., & Eid, M. (2008). A note on sex differences in mental rotation in different age groups. Intelligence, 36(6), 556-563.

Geisinger, K.F. (2001). Wonderlic Personnel Test and Scholastic Level Exam. In B.S. Plake and J.C. Impara (Eds.), The fourteenth mental measurements yearbook. Lincoln, NE: Buros Institute of Mental Measurements. Retrieved from Mental Measurements Yearbook with Tests in Print database.

Jung Cheol, S. (2010). Impacts of performance-based accountability on institutional performance in the U.S. Higher Education, 60(1), 47-68. doi:10.1007/s10734-009-9285-y

Keiser University. (2012). Keiser university catalog 2012-2013. 12(1). Ft. Lauderdale, FL: Author.

Mills, G. E. (2000). Action research: A guide for the teacher researcher. Upper Saddle River, NJ: Prentice-Hall, Inc.

Project on Academic Success. (2009). How colleges organize themselves to increase student persistence: Four-year institutions. New York, NY: College Board. Retrieved from http://advocacy.collegeboard.org/sites/default/files/09_0869_CollReten_ALL_WEB_090407.pdf

Rabovsky, T. M. (2012). Accountability in higher education: Exploring impacts on state budgets and institutional spending patterns. Journal of Public Administration Research & Theory, 22(4), 675-700.

Schraw, G. (2001). Wonderlic Personnel Test and Scholastic Level Exam. In B.S. Plake and J.C. Impara (Eds.), The fourteenth mental measurements yearbook. Lincoln, NE: Buros Institute of Mental Measurements. Retrieved from Mental Measurements Yearbook with Tests in Print database.

Tinto, V. (2007). Research and practice of student retention: What next? Journal of College Student Retention, 8(1), 1-19.

Wonderlic Personnel Test. (2001). In B.S. Plake and J.C. Impara (Eds.), The fourteenth mental measurements yearbook. Retrieved from the Buros Institute's Mental Measurements Yearbook with Tests in Print database.

Dr. Borghese is a Student Support Program Coordinator and a member of the psychology faculty at Keiser University.

Ms. Lacey is a Diagnostic Medical Sonography Program Director at Keiser University.

Table 1
            Descriptive Statistics

Variable    Mean    SD      N

SON 1614    77.08   5.61    36
MASMI       16.47   14.21   36
SLE         23.56   4.35    36
PHY2001     2.08    1.13    36

Table 2. Simultaneous Multiple Regression Analysis Summary
for SLE, MASMI, and PHY2001 Predicting SON 1614

Variable       B     SE B     β

MASMI         .10    .04    .25 *
SLE           .76    .16    .50 **
PHY2001      2.99    .51    .60 **
Constant    51.45   4.01

Note. R² = .70, F(3, 32) = 24.84, p < .01.
* p < .05. ** p < .001.