
Adjunct Online Faculty and Online Student Grades

INTRODUCTION

In 2013, more than 4,000 private for-profit 4-year online institutions offered degree programs and courses identical in curriculum to traditional campus-based programs and courses (Ginder & Sykes, 2013). In the same year, over 61% of online students in the United States were enrolled in private for-profit 4-year online institutions (Ginder & Sykes, 2013). The TPACK (Technological Pedagogical Content Knowledge) theoretical framework (Appendix A) measures instructor responses in three main areas, technology, pedagogy, and content knowledge, and in four additional domains where those constructs overlap, for a total of seven domains. The TPACK Survey tool measured responses of adjunct online faculty working at a private for-profit online institution.
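As a brief illustration of why three base constructs yield seven TPACK domains, the following sketch (Python, purely illustrative and not part of the study) enumerates every nonempty combination of technology, pedagogy, and content knowledge:

```python
# Illustrative only: the seven TPACK domains are the nonempty combinations
# of the three base constructs (technology, pedagogy, and content).
from itertools import combinations

constructs = ["Technological", "Pedagogical", "Content"]
domains = [
    " + ".join(combo) + " Knowledge"
    for r in range(1, len(constructs) + 1)
    for combo in combinations(constructs, r)
]

print(len(domains))  # 7: TK, PK, CK, TPK, TCK, PCK, and TPACK
for d in domains:
    print(d)
```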

LITERATURE REVIEW

The literature collected and analyzed addressed online instructional training, the areas of TPACK, online teaching experience, and student retention and success. A wide range of data and research documents discussed faculty preparation, professional development, online training, student retention, student satisfaction with courses, and online learning theory. Another theme found in the literature was faculty development and evaluation.

PURPOSE

This study's findings may provide direction on how to prioritize faculty assessment and professional development in the distance learning sector for any institution offering courses online. A closer look at adjunct online faculty and online student grades could help identify and formulate solutions for faculty so that better retention and grades may be achieved. The information collected may help online departments understand and characterize the knowledge domains of adjunct online faculty that may predict online students' grades. The following research questions guided the study:

1. What is the relationship between online student grades and self-reported technological knowledge (TK), pedagogical knowledge (PK), and content knowledge (CK) of adjunct online instructors?

2. What is the relationship between online student grades and self-reported pedagogical content knowledge (PCK), technological content knowledge (TCK), and technological pedagogical knowledge (TPK) of adjunct online instructors?

3. What is the relationship between self-reported pedagogical content knowledge (PCK), technological content knowledge (TCK), and technological pedagogical knowledge (TPK), and self-reported TPACK (Technological Pedagogical Content Knowledge) of adjunct online instructors?

4. To what extent, if any, is self-reported TPACK (Technological Pedagogical Content Knowledge) of adjunct online instructors related to online student grades?

METHODOLOGY/DESIGN

This study used a quantitative, correlational, nonexperimental research design to respond to the research questions and hypotheses. A quantitative approach was appropriate because the data collected about online students (grades) and adjunct instructors (TPACK scores) were numeric rather than qualitative. Quantitative correlational studies use measures of association between variables to describe the strength and significance of relationships in the data. Nonexperimental research was appropriate because the participants were not under any treatment, control, or influence (Vogt, 2007). Measures of association did not describe cause and effect, as the intent of this study was to assess the association and relationship between the predictor variables and the criterion variable.
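As a minimal sketch of the kind of measure of association such a design relies on (the study does not specify the exact statistic in this summary, and the data below are invented for illustration), a correlation between instructor TPACK composites and mean student grades could be computed as follows:

```python
# Illustrative sketch of a correlational, nonexperimental analysis:
# relate a predictor (instructor TPACK composite, 1-5) to a criterion
# (mean student grade, 0-100). Values below are hypothetical.
from scipy import stats

tpack_scores = [3.2, 4.1, 4.5, 3.8, 2.9, 4.7, 3.5]
mean_grades = [61.0, 72.5, 68.3, 70.1, 55.4, 74.9, 66.2]

r, p = stats.pearsonr(tpack_scores, mean_grades)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")

# Spearman's rho is an alternative when predictors are treated as ordinal.
rho, p_rho = stats.spearmanr(tpack_scores, mean_grades)
print(f"Spearman rho = {rho:.2f}, p = {p_rho:.3f}")
```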

SETTING/POPULATION/SAMPLING

A TPACK Survey (Schmidt et al., 2010) was distributed in January 2016 via a SurveyMonkey.com link to 148 anonymous adjunct online faculty at a private for-profit online institution in the United States. Potential participants were given the option to volunteer and participate in the survey; alternatively, each potential participant could opt out and exit without any penalty or obligation. Eighty-one adjunct online faculty members completed the TPACK Survey (Schmidt et al., 2010), for a response rate of 57%. The sample drawn from this population of adjunct online instructors at one private for-profit online institution met the suggested size and was large enough to generalize the results to that population.

INSTRUMENT

The TPACK Survey was created by Schmidt et al. (2010) to measure each of the seven TPACK domains. Technological knowledge, pedagogical knowledge, content knowledge, and the overlapping areas were included in the survey. The tool was selected because of its relevance to the literature regarding the predictor variables (technological knowledge, pedagogical knowledge, content knowledge, and the overlapping areas) of adjunct online instructors. The survey questions pertaining to adjunct online instructors' technological knowledge, pedagogical knowledge, content knowledge, and overlapping areas were divided into groups. The three predictor variables (technological knowledge, pedagogical knowledge, and content knowledge) were ordinal data measured on a 5-point Likert scale. Student grades were measured according to the average points received in courses. The data released included the student percentages earned in the courses taught by participating instructors, which were treated as a ratio variable from 0 to 100, with higher numbers indicating higher proficiency within the course.
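As a minimal sketch of how a domain score might be derived from 5-point Likert items (the item names and groupings below are hypothetical placeholders, not the survey's actual item labels), each domain composite could be computed as the mean of its items:

```python
# Sketch: per-domain composite scores from hypothetical 5-point Likert items.
import pandas as pd

responses = pd.DataFrame({
    "tk1": [4, 5, 3], "tk2": [4, 4, 2],   # technological knowledge items
    "ck1": [5, 5, 4], "ck2": [4, 5, 4],   # content knowledge items
    "pk1": [4, 3, 5], "pk2": [5, 4, 4],   # pedagogical knowledge items
})

domains = {"TK": ["tk1", "tk2"], "CK": ["ck1", "ck2"], "PK": ["pk1", "pk2"]}
composites = pd.DataFrame(
    {name: responses[cols].mean(axis=1) for name, cols in domains.items()}
)
print(composites)  # one 1-5 composite per respondent per domain
```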

VALIDITY AND RELIABILITY

Researchers have established high levels of internal validity and internal consistency for the TPACK Survey (Schmidt et al., 2010). Schmidt et al. (2010) established acceptable internal consistency by observing Cronbach's alpha values between 0.75 and 0.92 for each of the seven constructs of the TPACK Survey (Koehler et al., 2014). Replication in other studies using the same research design indicates a reliable research technique because those studies produced similar results. All data were evaluated for missing data and extreme cases. The existence of extreme cases, or outliers, was assessed by evaluating standardized values: standardized values were generated for each of the TPACK scores, and each case was inspected for values above 3.29 or below -3.29 (Tabachnick & Fidell, 2012). All cases with missing data were evaluated to confirm that the missing values occurred at random. Volunteer participants who did not complete major sections were excluded from the final sample.
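The screening steps described above can be expressed compactly. The sketch below shows a standard Cronbach's alpha calculation for one construct's items and a z-score screen that flags cases beyond plus or minus 3.29; the data are illustrative and not drawn from the study:

```python
# Sketch: internal consistency (Cronbach's alpha) and outlier screening.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix for a single construct."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical 5-point responses for one construct (rows = respondents).
construct = np.array([[4, 5, 4], [3, 3, 4], [5, 5, 5], [4, 4, 3], [2, 3, 2]])
print(f"alpha = {cronbach_alpha(construct):.2f}")

# Outlier screen: flag standardized construct scores beyond +/-3.29
# (Tabachnick & Fidell, 2012).
scores = construct.mean(axis=1)
z = (scores - scores.mean()) / scores.std(ddof=1)
print("flagged cases:", np.where(np.abs(z) > 3.29)[0])
```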

RESULTS AND DISCUSSION

The population for this study was adjunct online faculty members at a private for-profit virtual institution in the United States. The faculty sample that received the questionnaire was representative of the population of interest. The institution reported that 55% of the population were male and 45% were female. Regarding age, the institution reported that the population was between the ages of 30 and 59. With regard to race, 75% of the population were white, and 25% were black, other, or from multiple races. Regarding degrees held, 89% of the population had master's degrees and 11% had doctorate degrees. From the data analysis, the majority of respondents (53.1%, n = 43) were male and 46.9% (n = 38) were female. Regarding age, 48.1% (n = 39) were 18-44, and the remaining 51.9% (n = 42) were 45 years of age or older, as presented in Table 1.

Regarding race, the majority of adjunct online faculty were white (72.8%, n = 59); Asian faculty (6.2%, n = 5) and black or African American faculty (6.2%, n = 5) were equally represented. Faculty race is presented in Table 2.

Eighty-five percent (n = 69) of adjunct online faculty reported that they held master's degrees and 15% (n = 12) of adjunct online faculty held doctorate degrees. Regarding online teaching experience, 8.6% (n = 7) of adjunct online faculty had one year or less; 32.1% (n = 26) had 1-4 years, and 59.3% (n = 48) of adjunct faculty had 5 years or more. Online teaching experience is presented in Table 3.

The annual amount of professional development in the areas of technology, content knowledge, and pedagogy for adjunct online faculty was determined by three survey questions. Frequency responses ranged from 1 (no professional development for the year) to 4 (at least four professional development events per year). Content knowledge was the most strongly endorsed area (mean score of 2.68), indicating more frequent professional development in that area. Across all responses, participants reported completing professional development activities two to three times a year, as presented in Table 4.

Student grades for each adjunct online instructor ranged from a mean of 14.5 to 100 (M = 68.07, SD = 17.51). Using a traditional grading scale, the adjunct faculty who volunteered to take the survey had student grade averages in their courses equivalent to a "D" (M = 68.07, or about 68%). Adjunct faculty participants had the highest endorsements in the areas of content knowledge (M = 4.37, SD = 0.91) and pedagogical knowledge (M = 4.12, SD = 0.91). Faculty participants had the lowest endorsements in the areas of technological pedagogical knowledge (M = 3.71, SD = 1.01) and technological knowledge (M = 3.90, SD = 0.88). Descriptive statistics are presented in Table 5.
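For reference, the "D" classification follows a common traditional grading scale; the cut points in the short sketch below are a widely used convention assumed for illustration, not values specified by the study:

```python
# Sketch: map a 0-100 mean grade to a letter on a conventional scale
# (assumed cut points: 90/80/70/60).
def letter_grade(pct: float) -> str:
    for cutoff, letter in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if pct >= cutoff:
            return letter
    return "F"

print(letter_grade(68.07))  # -> "D", matching the reported course average
```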

Participants reported their strengths or weaknesses with common technologies such as digital books, the internet, digital video, the learning management system, software, and methods of presenting information, as well as with newly developed technologies (Mishra & Koehler, 2006). Participants also perceived their strengths or weaknesses in the processes, practices, and methods of instruction, such as purposes, standards, instructional methods, and strategies for student learning, and in the subject matter being taught and the structure of that knowledge, that is, what instructors know about the subjects they teach (Mishra & Koehler, 2006). In addition, participants perceived how they use technological knowledge to enrich content for effective instruction, blend pedagogy and content knowledge to instruct effectively (Shulman, 1986), and use technological knowledge to enrich pedagogy for effective instruction (Mishra & Koehler, 2006). The results indicated that online instructors' TPACK did not predict student grades. According to the TPACK theoretical framework applied in this study, technology, pedagogy, and content knowledge practices promote instructional knowledge, better understanding, and purposeful, effective instruction.

There are areas of faculty preparation, training, and professional development that underscore the importance of online learner grades, achievement, and improved methods of assessing online learners. Online learners' grades achieved within online courses may not be an accurate measurement of student learning outcomes in virtual learning environments. The methods of evaluating learners' performance within online courses may need to be addressed differently than through a traditional grading system within current virtual institutions. An alternate method of evaluation in online courses may need to be developed so students are evaluated differently; a different assessment system for online learners may eventually replace current online grading systems.

The content knowledge findings imply that professional development activities focusing on technology and pedagogy could be enriched, with close attention to content and purposive activities, so that learners achieve better grades. Professional development activities that involve technology, pedagogy, and content knowledge can generate meaningful learning (McKee et al., 2013; Palloff & Pratt, 2011).

Institutional leaders may consider a closer examination of curriculum, design, course development, instructional quality, responsibilities and attitudes, human resources, recruitment, cycles of professional development, faculty effectiveness, and experience in delivering instruction. Given the growth of online courses and of part-time instructors teaching them, part-time faculty members require thorough examination. Higher education leaders can continue to foster student learning and address concerns for student success and retention rates while ensuring that part-time online faculty apply the seven areas of TPACK within their courses.

RECOMMENDATIONS FOR PRACTICE

Higher education stakeholders can plan to provide different methods of faculty orientation based on the TPACK framework for new part-time online faculty members. Orientation could prepare them with competency in the seven areas of technology, pedagogy, and content knowledge for application in their future courses with student learning outcomes in mind. Orientation for new appointees would not just introduce how the online courses and learning management system function; it would also demonstrate how to create lessons that promote use of TPACK for student mastery of the learning outcomes, measured not by the grades students achieve but by their performance on specific tasks. The TPACK framework supports student learning outcomes, and the findings of this study indicated that all areas of the constructs of technology, pedagogy, and content knowledge were related to one another (TPACK). Part-time online faculty orientation may include hands-on practice, technology assessment, content analysis, mentoring, and mock courses with specific prompts that trigger TPACK mastery under close observation. Participants in the orientation could be evaluated by other members and given opportunities to demonstrate how they are applying all areas of the TPACK domains to their lessons. Lastly, orientation may focus on how part-time online faculty can better assess and award learners' grades, so online learners receive an accurate record of their learning accomplishments.

Stakeholders and department chairs can support, develop, and promote professional development that addresses TPACK with specific activities that supplement the talents of existing part-time online instructors. Part-time online faculty can participate in professional development through newer methods, such as identifying where part-time faculty are weakest and building activities that improve skills that are not often practiced. When professional development frequency was assessed, findings indicated that professional development activities involving content knowledge were the most practiced by part-time online instructors; the least practiced areas were technology and pedagogical activities. Activities that blend both technology and pedagogy could be created and specially designed to increase performance, and professional development could be ongoing to address specific areas. Although there was no relationship between TPACK and student grades, stakeholders in higher education can still integrate professional development activities that improve online learner experiences and achievement. Professional development can still be student centered; hence, professional development activities that give instructors better methods of assessing and interacting with online learners may support the achievement of better grades.

Online faculty training usually occurs before courses are instructed. Online training for part-time online faculty can be enhanced by making it directly applicable to the courses they teach. Content and internal processes continuously change, so training should be ongoing. Department leaders can create customized faculty training rubrics, best practices, and specific prompts for a variety of subjects. Identifying constraints and addressing them within training is recommended. Evaluation of specific performances, actions, or tasks could lead to a better understanding of which types of training promote better mastery of online instruction. Training that uses different methods of assessing online faculty members and concentrates on the difficulties of online instruction may benefit online instructors. Alternatively, training that focuses on improving techniques for assessing student learning may produce a better assessment of online learner grades.

Faculty evaluation methods that attend closely to student learning outcomes, supplemented with different paradigms that measure additional areas, could help illuminate the influences on student performance. Frequent faculty evaluation is vital to ensure instructors remain competent and keep up with the continuous changes that occur in online learning environments; evaluation at the end of each course and weekly monitoring during the course are suggested. Department chair evaluation of faculty could include review of course grade distributions and grading rubrics. Faculty evaluation can be performed within the courses and can be linked to course data and student learning outcomes. Full participation of online learners would be necessary to evaluate faculty using such a newer faculty evaluation method.

CONCLUSION

Although this study indicated no correlation between TPACK and student grades, more research is recommended to illuminate the relationship so that more accurate conclusions can be drawn. The goal was to investigate the relationship between adjunct online faculty and student grades. Participants in previous TPACK research studies were predominantly preservice instructors: new, first-year, developing instructors with little or no instructional experience who were newly introduced to the profession.

Findings from this study regarding professional development indicated that content knowledge activities were the most frequent, as this area had the highest score. The results supported previous studies indicating that TPACK assessment could enrich content for effective instruction and blend pedagogy and content knowledge to instruct effectively while utilizing technological knowledge to enrich pedagogy for effective instruction (Mishra & Koehler, 2006; Schmidt et al., 2010). Prior TPACK studies did not evaluate the frequency and areas of professional development that adjunct online instructors participated in annually. Participants in this study were not given choices within the research questions to assess other variables that may be part of their online profession. Such variables may include assessment of the constraints in the learning environment, the learning management system, and the ability to discuss their understanding of their technology, pedagogy, and content knowledge.

The TPACK assessment tool might not yield enough variables to correspond with high or low student grades, which is a limitation. A side observation from this study was that existing research supports the study's results regarding low online student grades: the average online student grade in this study was 68%, consistent with previous findings that students in online courses achieve lower grades. The findings of this study supported previous research regarding low online student grades and the growth of adjunct online faculty (Allen & Seaman, 2013; Harris & Martin, 2012; Mishra et al., 2009; Porras-Hernandez & Salinas-Amescua, 2013). The information generated in this study could lead to exploration and future modifications to existing procedures in the assessment of online students; in the evaluation, professional development, and training of adjunct online instructors; and in the augmentation of TPACK assessment methods and administration. Higher education stakeholders have the opportunity to reflect on how they are addressing the technological, pedagogical, and content knowledge of their adjunct and even full-time faculty who instruct online courses, mainly for reasons other than concern for the impact on student grades. In summary, the present study indicates that adjunct online instructors' TPACK is not necessarily correlated with student grades, but many other variables discussed here have some relationship to the online educational milieu.

REFERENCES

Allen, I. E., & Seaman, J. (2013). Changing course: Ten years of tracking online education in the United States. Babson Survey Research Group.

Ginder, S., & Sykes, A. (2013). Characteristics of exclusively distance education institutions, by state: 2011-2012 (NCES 2013-172). http://nces.ed.gov/pubsearch

Harris, J. B., & Hofer, M. J. (2011). Technological pedagogical content knowledge (TPCK) in action: A descriptive study of secondary teachers' curriculum-based, technology-related instructional planning. Journal of Research on Technology in Education, 43(3), 211-229. https://doi.org/10.1080/15391523.2011.10782570

Harris, H. S., & Martin, E. W. (2012). Student motivations for choosing online classes. International Journal for the Scholarship of Teaching and Learning, 6(2). http://digitalcommons.georgiasouthern.edu/ij-sotl/vol6/iss2/11/

Leedy, P. D., & Ormrod, J. E. (2010). Practical research: Planning and design (9th ed.). Prentice-Hall.

McKee, C. W., Johnson, M., Ritchie, W. F., & Tew, W. M. (2013). Professional development of the faculty: Past and present. New Directions for Teaching & Learning, 2013(133), 15-20. https://doi.org/10.1002/tl.20042

Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108, 1017-1054. https://doi.org/10.1111/j.1467-9620.2006.00684.x

Mishra, P., Koehler, M., & Kereluik, K. (2009). The song remains the same: Looking back to the future of educational technology. TechTrends: Linking Research & Practice to Improve Learning, 53(5), 48-53. https://doi.org/10.1007/s11528-009-0325-3

Porras-Hernandez, L. H., & Salinas-Amescua, B. (2013). Strengthening TPACK: A broader notion of context and the use of teacher's narratives to reveal knowledge construction. Journal of Educational Computing Research, 48, 223-244. https://doi.org/10.2190/ec.48.2.f

Schmidt, D. A., Baran, E., Thompson, A. D., Mishra, P., Koehler, M. J., & Shin, T. S. (2010). Technological pedagogical content knowledge (TPACK): The development and validation of an assessment instrument for preservice teachers. Journal of Research on Technology in Education, 42(2), 123-149. https://doi.org/10.1080/15391523.2009.10782544

Shulman, L. (1986). Those who understand: Knowledge growth in teaching. Educational Researcher, 15(2), 4-14. https://doi.org/10.3102/0013189x015002004

Tabachnick, B. G., & Fidell, L. S. (2012). Using multivariate statistics (6th ed.). Pearson.

Vogt, P. W. (2007). Quantitative research methods for professionals. Allyn & Bacon.

APPENDIX A

Wendy Kaaki, American College of Education, 101 W. Ohio St, Suite 1200, Indianapolis, IN 46204

Email: wendy.kaaki@ace.edu
Table 1. Age of Adjunct Online Faculty

Age    n     %     Cumulative %

18-29   1    1.2       1.2
30-44  38   46.9      48.1
45-59  30   37.0      85.2
60+    12   14.8     100.0
Total  81  100.0

Table 2. Race of Adjunct Online Faculty

       Race                                 n     %    Valid %

White                                      59   72.8     73.8
Black or African American                   5    6.2      6.3
Asian                                       5    6.2      6.3
Native Hawaiian or other Pacific Islander   1    1.2      1.3
From multiple races                        10   12.3     12.5
Not answered                                1    1.2
Total                                      81  100.0

Table 3. Online Teaching Experience of Adjunct Online Faculty

Teaching Experience  n     %    Cumulative %

1 year or less       7     8.6    8.6
1-4 years            26   32.1   40.7
5-10 years           27   33.3   74.1
10 or more years     21   25.9  100
Total                81  100

Table 4. Professional Development

      Professional Development           N  Minimum  Maximum   M   SD

Technology
  As an online instructor, how often do  81    1        4     2.4  1.1
  you practice professional development
  that is specialized in technology?
Content Knowledge
  As an online instructor, how often do  81    1        4     2.4  1.1
  you practice professional development
  specialized in the content that you
  instruct?
Pedagogy
  As an online instructor, how often do  81    1        4     2.4  1.1
  you practice professional development
  specialized in the pedagogy of online
  instruction?

Note: 1 = none, 2 = once or twice a year, 3 = 3 times a year, 4 = At
least four times a year.

Table 5. Descriptive Statistics

  Variable                                             N   Minimum  Maximum    M      SD

Student grade                                          81    14.5     100    68.1   17.5
Technological Knowledge (TK)                           81     1         5     3.9    0.88
Content Knowledge (CK)                                 81     1         5     4.37   0.91
Pedagogical Knowledge (PK)                             81     1.71      5     4.12   0.91
Pedagogical Content Knowledge (PCK)                    81     1         5     4.15   0.92
Technological Content Knowledge (TCK)                  81     1         5     3.95   1.19
Technological Pedagogical Knowledge (TPK)              81     1.33      5     3.71   1.01
Technological Pedagogical Content Knowledge (TPACK)    81     2         5     4.04   0.88