
Using alumni and student databases for program evaluation and planning.

This article describes the process used to identify students in, and alumni of, an instructional design master's and doctoral program in order to evaluate the program's effectiveness. Databases were created for these two groups and then used to develop two datasheet surveys, the Survey of Students and the Survey of Alumni. The student survey was distributed during the new student orientation and via the program's listserv in the fall and spring semesters. The alumni survey was distributed via the US postal service and, again, through the program's listserv during the spring semester only. Of the 225 possible participants, approximately 25% of the surveys were returned and used in the data analyses. Results indicated overall satisfaction with the program among both students and alumni. Reported weaknesses were used in strategic program planning to implement changes to the curriculum.

Introduction

The purpose of the project was to establish databases of the students and alumni and then survey them for their impressions of the program. The information gathered was used to identify issues related to instructional design (ID) master's and doctoral programs at a Southeastern university. Additionally, the information gathered from these same audiences assisted in evaluating the programs and facilitating strategic planning by the program faculty members. Ultimately, the information gained will serve as a benchmark for identifying trends in the student populations and within the ID programs themselves.

The purpose of this article is to describe the process used to track students and alumni in what was, at the time the databases were established, a less than accommodating environment. In addition, the results of the information gathered are shared in terms of the types of information to collect when developing databases, the types of reports that facilitate program evaluation and strategic planning, and the types of issues, in general, that were found for our particular programs. Specific details are shared only when relevant to developing, reporting, and affecting program planning and evaluation because, while relevant to our particular ID program, the details may be less valuable to others.

Why gather information on the students and alumni?

Accreditation of Universities, Colleges, Programs, and Professionals

Most accreditation bodies of higher education institutions and programs require that programs assess their effectiveness. These accreditation processes often require self-study of individual programs as well as of the institution itself. Part of this self-assessment is based on information about the students and those who have graduated from the program. For instance, this University has just completed a 3-year review by its regional college accrediting agency, the Southern Association of Colleges and Schools (SACS), and the College of Education (COE) is currently undergoing a review by the National Council for Accreditation of Teacher Education (NCATE).

SACS is the regional accrediting agency for 11 states in the Southeastern U.S. and in Latin America and is one of six such agencies within the U.S. recognized by the U.S. Department of Education (SACS, 2004). NCATE is a national accrediting body for schools, colleges, and departments of education and is recognized by the U.S. Department of Education. NCATE determines which schools, colleges, and departments of education meet rigorous national standards in preparing teachers and other school specialists for the classroom (NCATE, 2004).

Additionally, professional associations or organizations may review programs within colleges and universities for accreditation based on standards established by the profession, field, or discipline. Such professional accreditations may credential or certify a college, a program, and its graduates as meeting the established standards. For example, the American Bar Association accredits law schools, and its state affiliates recognize graduates as members once they pass the bar exam. For the instructional design profession, the International Board of Standards for Training, Performance, and Instruction (IBSTPI) has established competencies for designated professions (Richey, Fields, & Foxon, 2001); however, it has yet to establish procedures for accrediting programs or certifying individual professionals (Davidson, 1985; 1987; Davidson-Shivers & Barrington, in preparation; Gustafson, 2002).

No matter which type of accreditation agency is involved, accreditation studies require that faculty and administrators document program effectiveness (or the lack of it) with data. For instance, regional college accreditation agencies require information on graduate program effectiveness in specific areas, such as currency of the curriculum, innovation in teaching, and advising and orientation of students. Additionally, they require that programs be able to assess the progress of students (2001 Institutional Self-Study; 2001 ID Graduate Programs Report for Self-Study). Such specific information requires data on and from the students and alumni in addition to the faculty.

Providing data for accreditation is one reason for obtaining data and keeping records of student progress. Another is to monitor students as they progress through their programs of study toward completion of their degrees.

Monitoring Student Progress for ID Program Planning

According to Underwood, Nault, and Ferguson (1994), databases and surveys are good sources of information, and others concur (Aviles, 2000; Delany, 1995; Kern, 1990; Rollman, 1991). Developing student and alumni databases, as well as surveying these groups, allows a program to collect pertinent information in order to evaluate current policies and practices and to plan for the future. Databases and surveys allow program faculty to systematically monitor student progress and completion rates. Nerad and Miller (1996) suggest that such monitoring is the first step toward student retention and urge that it be done annually. That is, if programs can clearly identify students' progress, then they should also be able to document attrition by identifying those students who are not making progress toward completing their degrees or who have left the program. Furthermore, this documentation would allow the program faculty to focus their advising on identifying potential roadblocks that could prevent students from completing their studies.
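
As a minimal sketch of the kind of annual progress monitoring described above, the following Python example flags students whose records show no milestone activity within the past year so that advisors can follow up. The record fields, names, and dates are hypothetical and are not drawn from the ID program's actual database.

```python
from datetime import date

# Hypothetical student records; the field names are illustrative only and do
# not reflect the actual structure of the ID program's database.
students = [
    {"id": 101, "name": "Student A", "last_milestone": date(2003, 4, 15)},
    {"id": 102, "name": "Student B", "last_milestone": date(2001, 9, 1)},
]

def flag_inactive(records, as_of, max_days=365):
    """Return students with no recorded milestone within max_days of as_of."""
    return [r for r in records if (as_of - r["last_milestone"]).days > max_days]

# Annual review: students flagged here would receive advising follow-up.
for s in flag_inactive(students, as_of=date(2004, 1, 15)):
    print(f"Follow up with {s['name']} (last milestone: {s['last_milestone']})")
```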

For the ID program, it was very difficult to track students and monitor their individual programs of study because of the University's old, inadequate registration and database systems. At the time the study was conducted, the registrar's office could only issue data for the current term, not past or future listings. These data sheets, such as advisor lists, were often faulty and incomplete since the data pertained only to those students enrolled in any given semester and not to all those enrolled in the program. Furthermore, the registration system was not capable of pre-registration. This lack of pre-registration did not allow for scheduling courses for the immediate upcoming term, for effective curriculum planning in the near future, or for projecting program needs in the long term. Any attempts to use information sent from the Registrar's office for decision-making and strategic planning became cumbersome, if not impossible.

Likewise, the institution's Alumni office did not always have complete or current information on the alumni of any given program, due to similarly antiquated data systems and a lack of resources to maintain accurate, up-to-date databases. (Luckily, these systems have been updated in the past year and now provide for pre-registration, improved student advising and monitoring capabilities, and program planning opportunities.)

Hence, it became imperative that the ID program develop its own system for tracking students and alumni and for planning for the future. It was decided that developing our own databases and surveys would be the best source of this information. The results could then be used in evaluation, planning, and decision-making efforts as well as in the institution's reaccreditation process.

Data Sources and Methods

Participants

The total number of possible participants was 225: approximately 100 students enrolled in the programs and 125 alumni who had received their degrees. Of the student body, only 55 responded to the survey (30 in the master's program and 25 in the doctoral program). For the alumni, we were able to obtain names and addresses for only a small portion of them. Of those 60, sixty-five percent responded and returned the datasheets (24 had earned a master's degree and 18 had earned doctorates).

Data Collection Procedures

The students were asked to complete the datasheet surveys once a year in the fall and then to update them annually for as long as they were in the ID program. The datasheets were distributed at the New Student Orientation meeting for incoming students (during fall and spring semesters) and to returning students already in the program via the program's listserv during the fall semester only. Students were directed to return the datasheets to the program's administrative secretary.

To gather information about alumni of the program, names and addresses were obtained from two main sources: a) exit surveys for graduating students and b) records from the University's alumni office. The alumni datasheet surveys were distributed via the US postal service with a return envelope provided and, on an annual basis during the spring semester, via the program's listserv, with a request that they be returned to either the administrative secretary or the principal investigator. The exit surveys were provided to students as they completed their degree programs.

Data Analyses

The information from both sets of surveys was entered into databases established in Microsoft® Access. Information from the databases was then analyzed using Microsoft® Excel or the Statistical Package for the Social Sciences 9.0 (SPSS Inc., 1999). Any data containing identifiers were removed or separated from the demographic parts of a survey prior to coding and analysis. Information from any individual was kept confidential, and any reporting of data and information was in aggregate form.
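
As a minimal sketch of this de-identification and aggregate-only reporting step, the following Python example (used here purely for illustration in place of Access, Excel, or SPSS) strips identifiers before analysis and reports only group-level figures; the field names and values are hypothetical.

```python
import statistics

# Hypothetical survey rows; the column names stand in for actual datasheet
# fields and the values are invented for illustration.
raw_rows = [
    {"name": "Respondent A", "program": "master's", "satisfaction": 4},
    {"name": "Respondent B", "program": "doctoral", "satisfaction": 5},
    {"name": "Respondent C", "program": "master's", "satisfaction": 3},
]

# Step 1: remove identifiers before coding and analysis.
deidentified = [{k: v for k, v in row.items() if k != "name"} for row in raw_rows]

# Step 2: report only aggregate figures, never individual responses.
by_program = {}
for row in deidentified:
    by_program.setdefault(row["program"], []).append(row["satisfaction"])

for program, scores in sorted(by_program.items()):
    print(program, "n =", len(scores),
          "mean satisfaction =", round(statistics.mean(scores), 2))
```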

Procedures for Reporting Results

Information based on the datasheet surveys was reported using tables or charts. Reports on the results were made at the ID program's Annual Faculty Retreat, at the Advisory Council meeting, and/or at meetings of scholarly organizations. The reports were also sent to the Graduate Dean and the COE Dean for inclusion in reports to the various accreditation agencies.

Survey Instruments

Two datasheet surveys, Survey of Students and Survey of Alumni, were developed by the program coordinator with the assistance of three graduate students. The participant directions for both surveys stated that information would be kept confidential and any reporting of data would be in aggregate form and not for any given individual. Access to the datasheets and databases was limited to only those involved in data entry, analysis, and reporting.

Survey of Students. The first part of this survey asked students to provide demographic data and an accounting of their progress toward their degree. This portion included the major milestones for the master's program as well as the doctoral program, and students were to mark what they had accomplished and when. Some examples are filing their programs of study, developing their internship proposals, and preparing for the comprehensive exams.
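
As one hypothetical way such a datasheet could be structured, the following Python sketch pairs each milestone with the term in which it was accomplished; the field names paraphrase the examples above and do not reproduce the actual survey items.

```python
from dataclasses import dataclass
from typing import Optional, List

@dataclass
class StudentDatasheet:
    """Illustrative record combining demographics with milestone entries.

    The milestone fields paraphrase examples from the survey (program of
    study, internship proposal, comprehensive exams); the real datasheet
    contained additional items not shown here.
    """
    student_id: int
    program: str                                   # "master's" or "doctoral"
    program_of_study_filed: Optional[str] = None   # e.g., "Fall 2002"
    internship_proposal: Optional[str] = None
    comps_preparation: Optional[str] = None

    def completed_milestones(self) -> List[str]:
        """Return the milestones the student has marked as accomplished."""
        milestones = {
            "program of study filed": self.program_of_study_filed,
            "internship proposal developed": self.internship_proposal,
            "comprehensive exam preparation": self.comps_preparation,
        }
        return [name for name, when in milestones.items() if when]

sheet = StudentDatasheet(1, "doctoral", program_of_study_filed="Fall 2002")
print(sheet.completed_milestones())   # ['program of study filed']
```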

The second part, an Exit Survey (containing 11 items), was administered at a later date and asked students to evaluate the program upon completion of their comprehensive exams or as they obtained their degree. Students responded to items asking them to evaluate the new student orientation meetings, the advising process, and the program in general. Several open-ended items asked what they found most or least beneficial and how the program could be improved, and allowed for other comments. (Note: Because of the small number of students at the final stage of their programs, only a few completed this end-of-program survey.)

Survey of Alumni. For the alumni, the survey had two parts consisting mostly of items that could be checked. Very few items were open ended, to allow for ease and speed in responding. The first part (containing 11 items) asked alumni to provide demographic information such as their addresses, their current employment and location, and which program they completed and when. The second part (containing 26 items) asked alumni to evaluate the program, whether their current jobs related to instructional design, and whether they would recommend the program to others.

Personal items such as salary range and employment level were also asked. Although Werner (1983) noted that individuals are more likely to complete a survey if the questions are relevant and do not make them uncomfortable, we asked these questions anyway. However, this second part contained no identifiers and could be returned separately, allowing some level of comfort for respondents concerned about anonymity and confidentiality.

Results

Student Survey Results

The results of the student survey indicated that of the 55 students who completed it, 36 were female and 19 were male. In terms of cultural or ethnic diversity, the majority were Caucasian American (n = 37), 6 were African American, 6 were Asian (Thai or Indian), and 2 were Hispanic. See Table 1 for further information on students of the ID programs.

Alumni Survey Results

The results indicated overall satisfaction with the program both from students just completing their degrees and from alumni who had been away from the program for a few years. The majority of alumni (both master's and doctoral) indicated that their jobs were related to ID and that their ID background had helped them obtain their current positions. Salaries were fairly dispersed across the various levels for both master's and doctoral alumni. The median salary range for alumni with master's degrees was $40,000-49,999 (n = 7), whereas for alumni with earned doctorates the median salary range was $60,000-69,999 (n = 4). Additionally, the majority indicated that they were 'satisfied' to 'very satisfied' with their current jobs, and most would recommend the ID program to others. See Table 2 for further information on the Survey of Alumni.

Based on the open-ended items and the other-comments item, the alumni commented on the strengths and weaknesses of the program. In general, both master's and doctoral alumni were positive about the currency of the faculty's knowledge and stated that faculty were approachable and supportive of students. Weaknesses mentioned included the inability to get into courses (because courses were full at registration or were not offered when needed or desired) and unanticipated changes in course scheduling that made it difficult to complete programs of study in a timely manner. These identified difficulties translated into program needs, which in turn helped to strengthen the program's arguments for additional faculty and other resources.

Discussion of the Results

The results (shown in Tables 1 and 2) highlight the demographics of the student body at that point in time and identify the employment status of those alumni responding to the survey. The strengths and weaknesses of the ID program, from both a close-up (student) and a long-term (alumni) view, were identified, documented, and reported to the accrediting agencies, the advisory council, and the faculty.

The databases were then used to begin monitoring students' progress over the following year. However, due to a lack of resources, the faculty member was able to maintain the student and alumni databases for only another year or so. The University did institute a comprehensive registration system that should allow for better maintenance of student records, scheduling of courses, and so on, which may assist the program in monitoring its current student body and in conducting strategic planning.

Of the surveys developed, only the second part of the student survey, the exit survey, is currently being used; it indicates student satisfaction with the program as students graduate, and the information is included in the ID program's reporting to accrediting agencies. Conversely, the plan to continue surveying alumni on an annual basis was not carried out, due in part to a lack of resources. Follow-up with alumni remains necessary for the integrity of the program, for the ability of students and faculty to develop and maintain an ID network, and for demonstrating the program's success to the accrediting agencies. Hopefully, the alumni survey will be resurrected, which means that a current database must once again be developed.

Summary

The datasheets and surveys were developed and administered to learn about students' and alumni's satisfaction with their education and their perceptions of its utility. The results from these data sources proved quite useful because they allowed the faculty to assess the ID program's effectiveness for accreditation, begin tracking students' progress toward their degrees, maintain contact with alumni, and plan necessary improvements to the master's and doctoral programs. Although the specific results of the surveys are of interest and pertinent to the ID program at hand, they may be less relevant to others. Hence, the importance of this discussion lies in describing the process used to locate and obtain information about students and alumni in a less than accommodating environment, with the results being used to identify the types of data to collect when creating databases and surveys.

Developing surveys and databases allowed this ID program to document point-in-time information about its student body and to survey alumni on their professional status and on the education they received. The databases were a starting point for the program faculty to monitor student progress and completion rates and to assess the status of the master's and doctoral programs at the time. Additionally, the results provided good information about each program, the services offered to students, and the areas needing improvement, which could be used for future planning. Furthermore, they provided essential information required by accrediting agencies, such as students' employment following graduation, ethnicity, and the responsiveness of the program to student needs.

Moreover, any program that wishes to remain current and vital within the field must evaluate itself in order to strengthen itself. To assist in evaluation and planning, documented information needs to be obtained and used. Databases, in conjunction with surveying students about their perceptions of the program, help provide some of that information. However, to understand the changes in any program, such databases need to be maintained and updated on a regular basis, and therein lies a problem. Most faculty do not have the time to update records and maintain a database, nor to administer surveys to students and alumni on a regular basis. Hence, administrative support in the form of reassigned time, administrative support staff time, and funding for conducting surveys (either through the postal service or through online survey systems) is necessary. Without such support, the information is only point-in-time data that is erratically gathered, with no system for benchmarking the dynamic changes that may be occurring within a program or for monitoring the progress and completion rates of its student body.
Table 1
Summary of Results in Surveying the ID Master's and Doctoral Students

                                             Master's    Ph.D.
ITEMS                                        Program    Program

Type of Degree:                                 30         24
Gender:                                      Master's    Ph.D.
  Females                                       17         18
  Males                                         13          6
Race/Ethnicity:                              Master's    Ph.D.
  Caucasian                                     19         18
  African American                               8          1
  Hispanic                                       0          1
  Asian/Thai                                     1          4
  Unanswered                                     2          1
Area of Academic Concentration:              Master's    Ph.D.
  Performance Systems                            5          1
  E-Developer                                   12          0
  Individualized Program                         8          2
  Applied Research                               1          1
  Elementary Education                           0          3
  Counseling                                     0          1
  Nursing Education                              0          2
  Problem Based Learning                         0          2
  Materials Designer                             0          1
  Corporate                                      0          2
  Distance/Online/Web Based                      0          3
  Instructional Design                           0          1
  Special Education                              0          1
  N. A.                                          5          4
Area of Employment:                          Master's    Ph.D.
  K-12 Schools                                   1          2
  Higher Education (Advisor, Financial Aid       2         10
    Officer, Instructor, etc.)
  University Graduate Assistants                 2          0
  Administrator/Management                       2          1
  Business/Marketing/Accounting                  0          4
  Staffing Specialist                            1          0
  Military                                       4          0
  Designer/Trainer/Systems Specialist            2          2
  Health Professionals                           3          0
  Self-employed                                  0          2
  Government/Probation Officer                   1          1
  Automotive                                     1          0
  Lab Technician                                 1          0
  No response/not applicable                     9          1
Residence Location                           Master's   Doctoral
  Within a 75 mile radius of campus             26         21
  Elberta, AL                                    0          1
  Biloxi, MS                                     2          0
  Bush, LA                                       0          1
  Lexington, TN                                  1          0
  Memphis, TN                                     0          1
  Durham, N.C.                                   1          0
Undergraduate GPA                            Master's    Ph.D.
  2.0-2.50                                       2          1
  2.51-2.90                                      7          3
  2.91-3.30                                      7          2
  3.31-3.5                                       3          4
  3.51-3.7                                       2          3
  3.71-4.0
  No response/Not applicable

Table 2
Summary of Results of Surveying the ID Alumni

                                        Average       Average
                                         Score         Score
ITEMS                                  Master's        Ph.D.

1. Degree:                                24            18

2. Yearly salary                      Median = 7    Median = 4

3.                                    8 80+ = 5     8 80+ = 2
                                      7 70-79 = 1   7 70-79 = 3
                                      6 60-69 = 1   6 60-69 = 4
                                      5 50-59 = 3   5 50-59 = 3
                                      4 40-49 = 7   4 40-49 = 3
                                      3 30-39 = 2   3 30-39 = 1
                                      2 20-29 = 1   2 20-29 = 0
                                      1 <20 = 2     1 <20 = 0

4. When did you obtain your current      2.22          2.00
   job? (Before, While, or After      Before = 6    Before = 7
   your studies)                       While = 6     While = 3
                                       After = 11    After = 7

5. In general, how satisfied are         4.43          4.35
   you with your present job?            D = 1        VD = 1
                                        N/S = 2       N/S = 1
                                         S = 6         S = 5
                                        VS = 14       VS = 10

6. Is your current job related to       No = 5        No = 3
   your degree in ID?                  Yes = 18      Yes = 13

7. Did your degree in ID help you       No = 6        No = 6
   obtain your current job?            Yes = 17      Yes = 11

8.
7.1 Did your degree in ID help you      Yes = 8       Yes = 9
get a salary increase in your job?

7.2 Did your degree in ID help you      Yes = 7       Yes = 6
gain a promotion?

7.3 Did your degree in ID help         Yes = 10       Yes = 4
you get a better job with a new
employer?

7.4 Others (please specify)             Yes = 6       Yes = 2

9. Quality of Instruction                3.58          3.72

10. Course Content                       3.38          3.72

11. Course Scheduling                    3.04          3.33

12. Course Availability                  2.83          3.22

13. Equipment                            3.21          3.50

14. Computer and Computer Lab            3.29          3.39
    Access

15. Student Support Services                           3.06

16. Overall preparation for              2.83          3.44
    work/school after graduation
    from the ID program

17. Online course                        N = 7         N = 5
                                       M = 3.14      M = 3.60
18. In general, the ID program           5.00          5.22
    had (major / little) effect on
    how I function professionally

19. In general, the ID program           5.67          5.89
    (increased /did little to
    increase) my knowledge and
    skills in ID

20. All in all, I (would strongly /      5.33          5.56
    not) recommend ID to others

21.-25. These were open-ended items       N/A           N/A
asking for comments on particular
statements.

26. Other Comments: This last             N/A           N/A
item allowed individuals to add
other comments about the program

                                           Missing
ITEMS                                       Data

1. Degree:

2. Yearly salary                          Ph.D. = 2
                                         Master's = 2
3.

4. When did you obtain your current       Ph.D. = 1
   job? (Before, While, or After         Master's = 1
   your studies)

5. In general, how satisfied are          Ph.D. = 1
   you with your present job?            Master's = 1

6. Is your current job related to         Ph.D. = 2
   your degree in ID?                    Master's = 1

7. Did your degree in ID help you         Ph.D. = 1
   obtain your current job?              Master's = 1

8.
7.1 Did your degree in ID help you
get a salary increase in your job?

7.2 Did your degree in ID help you
gain a promotion?

7.3 Did your degree in ID help
you get a better job with a new
employer?

7.4 Others (please specify)

9. Quality of Instruction             Scale of 1 to 4
                                       4 = Excellent
                                       1 = Poor

10. Course Content                    Scale of 1 to 4
                                       4 = Excellent

11. Course Scheduling                 Scale of 1 to 4
                                       4 = Excellent

12. Course Availability               Scale of 1 to 4
                                       4 = Excellent

13. Equipment                         Scale of 1 to 4
                                       4 = Excellent

14. Computer and Computer Lab         Scale of 1 to 4
    Access                             4 = Excellent

15. Student Support Services          Scale of 1 to 4
                                        4 = Excellent

16. Overall preparation for           Scale of 1 to 4
    work/school after graduation        4 = Excellent
    from the ID program

17. Online course                     Scale of 1 to 4
                                        4 = Excellent

18. In general, the ID program        Scale of 1 to 6
    had (major / little) effect on      6 = major
    how I function professionally       1 = little

19. In general, the ID program        Scale of 1 to 6
    (increased /did little to          6 = increased
    increase) my knowledge and        1 = did little to increase
    skills in ID

20. All in all, I (would strongly /   Scale of 1 to 6
    not) recommend ID to others        6 = strongly
                                      1 = not recommend

21.-25. These were open-ended items        N/A
asking for comments on particular
statements.

26. Other Comments: This last              N/A
item allowed individuals to add
other comments about the program


Acknowledgement:

The authors would like to acknowledge Ms. Angelia Bendolph for her assistance in setting up the database structures using Microsoft® Access.

References

Adelman, S. I. (1995). Amarillo College: Tracking in a two-year college context. In P.T. Ewell (Ed.) Student tracking: New techniques, new demands. (pp. 31-42). New Directions for Institutional Research, no. 87. 17(3). San Francisco: Jossey-Bass.

Aviles, C.B. (2000). Successful Collaboration between Student Affairs and Academic Affairs with a Graduate Follow-up Survey. Buffalo, NY: State University of New York. (ERIC Document Reproduction Service No. ED 466 707).

Borden, V.M.H. (1995). Harnessing new technologies for student tracking. In P.T. Ewell (Ed.) Student tracking: New techniques, new demands. (pp. 55-66). New Directions for Institutional Research, no. 87. 17(3). San Francisco: Jossey-Bass.

Davidson, G. V. (1985, February). Specialization within the instructional design profession. Paper presented at the meeting of the Association for Educational Communications and Technology, at Anaheim, CA.

Davidson, G. V. (1987, February). Seven requirements to be a legitimate profession: What happens if the instructional design field doesn't make it? Paper presented at the meeting of the Association for Educational Communications and Technology, at Atlanta, GA.

Davidson-Shivers, G. V., & Barrington, M. (in final preparation). Revisiting the professional status of instructional design and technology and the specializations within. Manuscript to be submitted for publication.

Delany, A.M. (1995). Quality assessment of Professional Master's Degree Program. Boston, MA: The Association for Institutional Research. (ERIC Document Reproduction Service No. ED 387 008).

Dempsey, J. V., & Van Eck, R. (2002). Instructional design online: Evolving expectations. In R. A. Reiser, & J. V. Dempsey, (Eds.), Trends and issues in instructional design and technology (pp. 281-294). Upper Saddle River, NJ: Pearson Education, Inc.

Ewell, P.T. (1995). Working over time: The evolution of longitudinal student tracking data bases. In P.T. Ewell (Ed.) Student tracking: New techniques, new demands. (pp. 7-20). New Directions for Institutional Research, no. 87. 17(3). San Francisco: Jossey-Bass.

Ewell, P. T. (Ed.) (1995). Student tracking: New techniques, new demands. New Directions for Institutional Research, no. 87. 17(3). San Francisco: Jossey-Bass.

Green, M.J. (1995). Tracking students who transfer: Electronic transcript exchange. In P.T. Ewell (Ed.) Student tracking: New techniques, new demands. (pp. 67-76). New Directions for Institutional Research, no. 87. 17(3). San Francisco: Jossey-Bass.

ID Graduate Programs Report for Self-Study. (2001). IDD program report for the SACS self-study of the institution. University of South Alabama, Mobile, AL

Kern, R. P. (1990). A model addressing institutional effectiveness: Preparing for regional accreditation. Community College Review, 18(2), 23-28.

NCATE. (2004). National Council for Accreditation of Teacher Education Home Page. Retrieved March 16, 2004 from: http://ncate.org/faqs/faq_ncate.htm

Nerad, M., & Miller, D. S. (1996, Winter). Increasing student retention in graduate and professional programs. In J. G. Haworth (Ed.), Assessing graduate and professional education: Current realities, future prospects. New Directions for Institutional Research, 92, 61-76.

Porter, J.D., & Gebel, M.A. (1995). Arizona State University: Student tracking in a university setting. In P.T. Ewell (Ed.) Student tracking: New techniques, new demands. (pp. 21-30). New Directions for Institutional Research, no. 87. 17(3). San Francisco: Jossey-Bass.

Richey, R. C., Fields, D. C., & Foxon, M. (2001). Instructional design competencies: The standards (3rd Ed.). Syracuse, NY: ERIC Clearinghouse on Information & Technology.

Rollman, S.A. (1991). Program Assessment as a Means To Improve Instruction and Enhance Faculty Development: The First Year of Assessment of Human Communication Majors at James Madison University. Atlanta, GA: The Speech Communication Association. (ERIC Document Reproduction Service No. ED 349 584).

Russell, A. B., & Chisholm, M.P. (1995). Tracking in multi-institutional contexts. In P.T. Ewell (Ed.) Student tracking: New techniques, new demands. (pp. 43-54). New Directions for Institutional Research, no. 87. 17(3). San Francisco: Jossey-Bass.

Seppanen, L.J. (1995). Linkages to the world of employment. In P.T. Ewell (Ed.) Student tracking: New techniques, new demands. (pp. 77-92). New Directions for Institutional Research, no. 87. 17(3). San Francisco: Jossey-Bass.

Southern Association of Colleges and Schools. (2004). SACS home page. Retrieved March 16, 2004 from http://www.sacs.org/impages/abtsacsthe_southern_assocov.jpg

SPSS (1999). Statistical program for social sciences, version 9.0. Chicago, IL: SPSS Inc.

USA (2001). SACS institutional self-study report. University of South Alabama, Mobile, AL

Underwood, D.G., Nault, E.W., & Ferguson, L.S. (1994). Sometimes More Is Better: Development and Implementation of a Graduate Alumni Survey to Increase Response Rates and Evaluate Strategic Planning. New Orleans. LA: The Association for Institutional Research. (ERIC Document Reproduction Service No. ED 373 626).

Werner, (1983).

Gayle V. Davidson-Shivers, Ph.D.; Kit Inpornjivit, M.B.A.; Kim Sellers, M.Ed.