
The complexities of managing research projects: an ongoing study of developing a quality framework and measuring perceptions of service quality at UniSA.

Introduction

Australian education continues to face a number of challenges as it strives to provide the nation with advanced knowledge and innovative research and development (Australian Vice-Chancellors' Committee, 2004). As a consequence of these challenges and pressures, universities acknowledge that they belong to a 'market' that is becoming increasingly competitive. One area of university operations that has historically been overlooked in the "quality" forum is research. Currently, the Government's measures are performance, or "outputs", based, considering only successful higher degree research student completions, staff and student research publications, and staff research income. There is, however, no opportunity for the research partner/client to provide "input" about their level of satisfaction with the experience.

In order to understand better the expectations of clients and to attain a superior competitive position, the University established an ISO9001 Quality Management system. The aim of this paper is to provide an overview of an approach to improving research management by measuring and streamlining the processes that support research project activity. It is not the intent of this paper to provide the prescriptive methodology used to survey clients.

The first section defines the principles, requirements and intent of ISO, while the second section explores how ISO9001 is applied and implemented at UniSA and, importantly, reveals how a formal management system has been key in driving improvement strategies through the development of quality performance measures in relation to the services provided to its external research clients.

What is ISO?

In the pursuit of competitive advantage, it is increasingly important to identify the demands and values of current and potential clients (Mentzer, Flint & Kent, 1999). As we enter the 21st Century, it is imperative that we consider the complexities of our environment, such as technology, globalisation, competition, change, speed of change and complexity itself (Tetenbaum, 1998), as these factors contribute to the challenges of our organisational existence. If organisations, including University Research Offices, accept these complexities and challenges, we must then address them by seeing knowledge, or the attainment thereof, as a prerequisite for sustainability. How do we best address these conditions and challenges and achieve competitive advantage? How do we give rise to a sustainable future?

It would be naive to suggest that ISO9001 is the complete answer; however, for UniSA, it does provide a formal management system and framework for identifying client requirements, setting organisational objectives, assigning responsibilities, managing human and material processes and monitoring the output of the system, including client satisfaction, with a view to continual improvement. This being the case, the formalised system enables controlled interaction with the environment in which we operate.

The ISO9000 model contains eight management principles designed to enable continual improvement. They are:

1. Client focus

2. Leadership

3. Involvement of people

4. Process approach

5. Systems approach to management

6. Continual improvement

7. Factual approach to decision making

8. Mutually beneficial supplier relationships

Complementing these underlying principles is a series of requirements that need to be met in order to be certified (or registered, as it is often referred to in North America). They are:

1. Management Responsibility--Responsibility for the system rests with the 'top management' of the organisation, thus at a strategic level.

2. Resource Management--Sufficient human and physical resources are available to carry out the processes.

3. Product Realisation--There are controlled processes in place to support and manage products/service provisions.

4. Measurement, Analysis and Improvement--Strategies are in place that allow the system to be measured objectively and which allow for collecting information about how the system is performing in relation to client requirements.

Unlike many of the ISO standards, ISO9001:2000 is a "generic" standard--that is, it can be applied to any organisation, regardless of size or type. The model's four requirements function similarly to the PLAN-DO-CHECK-ACT (PDCA) improvement cycle popularised by W. Edwards Deming. It is a process approach, and its framework (Figure 1.1) illustrates how client requirements drive the input and client satisfaction drives the output. The process approach emphasises the importance of understanding and fulfilling the requirements of the client, the need to consider processes in terms of added value, the obtaining of results of process performance and effectiveness, and the continual improvement of processes based on objective measurement (Joint Technical Committee QR-008, 2000).

[FIGURE 1.1 OMITTED]

As illustrated, the process-based quality management system shows the significant role that clients have in defining requirements as inputs. The continual improvement of a quality management system is driven by monitoring client satisfaction, evaluating information relating to clients' perceptions, as it is these perceptions that determine whether an organisation has met its clients' requirements (AS/NZS ISO, 2000). By deploying this framework and abiding by the requirements of ISO9001:2000, UniSA has developed processes, policies and procedures to ensure the following results:

1. Client/Industry Partner requirements are defined and documented ensuring alignment between client expectation and UniSA's perception of that expectation.

2. Project Management processes are developed to ensure the clients' requirements are fulfilled.

3. Client feedback is obtained at the end of every research and consultancy project as well as via an annual survey as described in the Service Quality Perceptions part of this paper.

4. Objective measurement of client feedback and process effectiveness is used to assist with decision making, leading to continual improvement and superior competitive positioning.

Because quality is critically linked to an organisation's success (Buzzell & Gale, 1987; Gronroos, 1990; Howat, Milne & Crilley, 1996), UniSA has developed a framework which includes a suite of tools and systems for managing its research and consultancy projects. The aim of these is to foster process consistency and to learn more about our clients.

To measure client satisfaction objectively, one must have a consistent approach. To this end, the Research and Innovation Services Office has developed policy and procedures for the project management of research and consultancy, along with mechanisms for the continual review and improvement of its processes. This approach is consistent with Johnson's (1993) concept of ISO quality, which suggests that ISO9000 is focused on meeting client needs with a system that is appropriate, planned, controlled, documented and fully understood. Additional information may be found at http://www.unisa.edu.au/res/busadmin/default.asp

At a functional level, a reliable approach to the management, tracking and recording of research and consultancy projects has been achieved through the development of a web-based project management system called the Project Quality System (PQS). The PQS enables projects to be managed from the proposal stage through to project completion, tracking and recording client details, intellectual property opportunities, risk assessment, capacity approval, budget entry, and client feedback. It is this last component, client feedback, which is most important, as it enables us to assess how the services we deliver are valued by our clients.
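As a rough illustration of the record-keeping just described, the sketch below models a single PQS project entry in Python. The class and field names are assumptions for illustration only; the actual PQS schema is not published.

```python
# A minimal sketch of the kind of record the PQS tracks per project.
# All names here are illustrative assumptions, not the real PQS schema.
from dataclasses import dataclass, field
from enum import Enum


class Stage(Enum):
    PROPOSAL = "proposal"      # proposal stage
    ACTIVE = "active"          # project under way
    COMPLETE = "complete"      # project completion


@dataclass
class ProjectRecord:
    client_details: str                  # who the external partner is
    ip_opportunities: str                # intellectual property notes
    risk_assessment: str                 # outcome of the risk review
    capacity_approved: bool = False      # capacity approval granted?
    budget: float = 0.0                  # budget entry
    stage: Stage = Stage.PROPOSAL        # proposal through to completion
    client_feedback: list[str] = field(default_factory=list)
```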

Service Quality Perceptions--The Clients' Perceptions

The initial question to be addressed therefore is "What do our clients value or want in their interactions with UniSA's research services?"

Beginning in 2001, a portfolio of service attributes was identified by conducting two focus groups with external clients. The service quality issues raised shaped the development of a self-administered questionnaire (see Appendix A), which was subsequently piloted with a sample of clients from the PQS database. The self-administered questionnaire uses a tailored, service-specific version of SERVQUAL, a conceptual service quality model able to facilitate the monitoring of clients' service quality expectations and performance (Parasuraman, Zeithaml & Berry, 1985, 1988).

The adaptation of SERVQUAL depends on two variables: expected service and perceived service. The two variables are compared so that the "perceived service quality" is interpreted from the differences in degree and direction between perceptions and expectations. For example, the service quality attribute "employee enthusiasm" in 2003 had an importance rating of 5.1 and a performance rating of 4.9, resulting in a service quality gap of -0.2. The smaller the gap, the better the alignment between the two variables--expected service and perceived service.
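The gap calculation itself is simple arithmetic. A minimal Python sketch, reproducing the "employee enthusiasm" figures quoted above:

```python
# Perceived service quality gap: performance minus importance.
# A negative gap means performance falls short of expectations.

def csq_gap(importance: float, performance: float) -> float:
    return round(performance - importance, 2)

# The 2003 "employee enthusiasm" example from the text:
print(csq_gap(importance=5.1, performance=4.9))  # -0.2
```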

The inaugural survey study of external clients, conducted in March 2001, netted a response rate of approximately 40%. Subsequent factor analysis on the 2001 data set led to improvements to the questionnaire, which included the identification of three industry specific dimensions of service quality. These were categorised as "product/service delivery," "human resources" and "assurance and reliability."

Each year since, the Research and Innovation Services Office has contracted an independent research centre, specialising in these types of studies, to survey our clients under the guidance of a pre-determined set of criteria. The response rate during the last four years has ranged from 34 to 40 per cent.

The first two annual surveys examined only UniSA's service delivery. The study has since been extended to include the Australian Technology Network (ATN) universities, of which UniSA, Curtin University (Perth), Queensland University of Technology (QUT), the University of Technology Sydney (UTS) and the Royal Melbourne Institute of Technology (RMIT) are members. The perceived advantage of extending the survey to several comparable institutions is that it enables benchmarking and identifies areas of best practice. Each participating organisation receives an individual report with its results and a separate section comparing these to the benchmark average.

This paper now examines a small sample of survey results that are considered key indicators, highlighting areas of strength and areas that require monitoring or attention. Furthermore, it explores an area that was highlighted as "requiring further attention" and explains how the underlying process was identified and improved.

Using a seven-point scale ranging from Very Dissatisfied to Very Satisfied, survey respondents are asked to rate their levels of satisfaction. The attribute shown in Figure 1.2 is a summary attribute that measures overall satisfaction.

The measurement above illustrates that overall satisfaction with UniSA has increased (to 6.00 in 2004, from a maximum of 7.00) relative to the previous surveys (5.91 and 5.65). Achieving 86% of the maximum score (i.e. 6.00 out of 7.00) is considered a strength, as overall satisfaction has increased by a total of 6% over a three-year period and is above the ATN benchmark. It could be argued that a 6% improvement over three years is at best small; however, this argument loses its strength when it is seen that UniSA was already achieving 81% of the maximum in 2002 and 84.4% in 2003. In other words, because UniSA is scoring close to the available maximum, large percentage improvements are difficult to achieve within such limited remaining scope.
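For readers checking the arithmetic, a small sketch of the percent-of-maximum conversion used above, with the values quoted in the text:

```python
# Convert a mean rating on a 7-point scale to a percentage of maximum.

def pct_of_max(score: float, maximum: float = 7.0) -> float:
    return round(score / maximum * 100, 1)

print(pct_of_max(5.65))  # 2002 -> 80.7 (~81%)
print(pct_of_max(5.91))  # 2003 -> 84.4
print(pct_of_max(6.00))  # 2004 -> 85.7 (~86%)
```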

Similarly, another positive summary attribute (Figure 1.3) is recommendation to others. The tool used to measure the extent to which clients would recommend UniSA to others is a five-point scale that ranges from Strongly Not Recommend to Strongly Recommend.

Aggregating the UniSA scores for "strongly recommend" and "recommend", it can be seen that UniSA has achieved high levels of recommendation, which have remained consistent each year: 88% in 2002, 89% in 2003 and 89% in 2004. Furthermore, benchmarking against the ATN universities shows that UniSA is above the total recommendation benchmark score, further evidence that service provision for this attribute is at a level where clients are willing to recommend UniSA to others.

The scale used in this part of the questionnaire ranges from 1 (disagree) to 6 (strongly agree). The importance mean refers to the extent to which respondents believe a particular service attribute is important to them; the performance mean measures how the service attribute is perceived to be performing. These two means are used to calculate the client service quality (CSQ) gap for each attribute. By asking respondents to rate importance and performance for each attribute of service quality, the 2004 survey highlighted a number of attributes that may be considered competitive strengths; these are shown in Table 2.1.

The benefits of these results are threefold: confidence amongst researchers and research administration staff that their services are valued by clients; material for use in external marketing; and knowledge of which processes can be maintained and which need improvement.

An area identified in the 2003 survey as needing further consideration was that of "administrative processes." As the nomenclature suggests, this question focused on measuring clients' perceptions of the administrative support processes (legal, finance, ethics) of the research project, using a scale from 1 (disagree) to 6 (very strongly agree). In 2003, the survey results identified a comparatively large gap between clients' importance ratings and their perceptions of UniSA's performance. The gap of -1.0 (importance 4.9 and perceived performance 3.9) was larger than the ATN benchmark and also larger than that recorded in the previous year, which was -0.7. The relatively high importance rating of 4.9 reinforced that clients considered this aspect of service quality integral to a research project's success.

To better understand "administrative processes," an external focus group study was conducted with the aim of obtaining detailed feedback from clients who had engaged in research projects. Using a "storybook" approach, participants were asked the following nine questions and encouraged to write their responses on cards; discussion was then facilitated to generate more ideas and elaborate on key points:

1. When engaging in research projects with UniSA, what in general were the areas in which problems were experienced?

2. When you commenced a research project did you experience any administration problems?

3. In relation to these administration problems, what should UniSA improve and how?

4. Does your organisation have any "best practice" administration principles that UniSA might adopt?

5. How can you as a client help UniSA to deliver products and services on time and according to specification?

6. What would your organisation like to be asked by UniSA staff at the conclusion of a project?

7. What should be the feedback mechanism(s) for your organisation to provide suggestions back to UniSA staff after a project is completed?

8. How many feedback mechanisms should there be?

9. Any other thoughts/experiences?

Free-text responses from the storybook methodology included: financial monitoring takes a number of calls to sort out; incorrect invoicing; financial processes are not timely, especially the production of invoices; lack of clarity about processes for payment in joint research projects; and financial arrangements not specifically established. These responses identified that the financial processes used to support research and consultancy projects needed urgent consideration.

Furthermore, free-text responses from the survey (e.g. financial monitoring takes a number of calls to sort out; incorrect invoices) had reinforced this aspect of service quality as one requiring immediate attention.

To address the area of financial processes, an internal review was conducted on the processes that support the financial management of projects. A number of interviews were conducted with senior administrative staff and active researchers. Findings from these interviews were documented and reviewed with the aim of identifying common issues and opportunities.

Based on the findings, it became evident that the processes for invoicing correctly and on time needed immediate action. To improve the existing, inadequate processes, it was agreed in consultation with relevant stakeholders that all financial monitoring and invoicing would be standardised using the PQS milestone facility. This initiative was chosen because the PQS can automatically remind appropriate staff via email that financial action needs to be taken, based on pre-entered project deliverable and financial schedules. To ensure that staff act upon the PQS financial milestone emails, staff in the Research and Innovation Services Office generate regular reports from the PQS database to monitor financial milestone completion and, where necessary, take appropriate action to ensure that scheduled financial milestones are satisfied.
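A minimal sketch of the milestone-reminder logic described above, under an assumed data model; the PQS's internals are not published, so the structures and the reminder step here are illustrative only.

```python
# Illustrative sketch only: the real PQS data model and email mechanism
# are not published. This models the reminder/report logic described above.
from dataclasses import dataclass
from datetime import date


@dataclass
class FinancialMilestone:
    project_id: str
    description: str        # e.g. "Issue invoice for stage 2 deliverable"
    due: date               # from the pre-entered financial schedule
    owner_email: str        # staff member responsible for the action
    completed: bool = False


def reminders_due(milestones: list[FinancialMilestone],
                  today: date) -> list[FinancialMilestone]:
    """Incomplete milestones at or past their due date; these would
    trigger the automatic reminder emails to appropriate staff."""
    return [m for m in milestones if not m.completed and m.due <= today]


# The kind of monitoring report the Research and Innovation Services
# Office would run against the PQS database:
schedule = [FinancialMilestone("P-001", "Issue stage 2 invoice",
                               date(2004, 6, 30), "finance@example.edu")]
for m in reminders_due(schedule, today=date(2004, 7, 1)):
    print(f"REMIND {m.owner_email}: {m.project_id} - {m.description}")
```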

In an effort to better comprehend and track financial processes, the 2004 survey instrument was modified: the "administrative processes" attribute was removed and replaced by two questions, one focusing on "financial processes" and the other on "legal processes." The results of the 2004 survey show that the gap (importance vs. performance) for each is smaller than the gap for the previously included "administrative processes." Future surveys will help confirm these results; however, preliminary investigations suggest that 2004 survey respondents have had a better experience of this aspect of UniSA's service provision.

Conclusion

Prior to 2001, there existed no systematic way for Australian higher education research managers to measure the level of service quality provided to their clients. With increasing expectations from industry clients, research management processes have become more complex, and risks have increased as the pressure for quick turnaround times for contracts and deliverables grows. The development of a formal ISO9001:2000 quality framework and the ongoing measurement of client satisfaction have enabled these challenges and complexities to be addressed by providing a frame of reference for organisational learning while, at the same time, providing clients with assurance that their requirements for products/services will be met and delivered to specification under a globally accepted quality standard (http://www.isixsigma.com/library/content/c000917b.asp).

UniSA is now able to objectively identify its strengths and to address, on an ongoing basis, the areas its external clients perceive as requiring attention. Through this measurement, services can be improved by reviewing and, where appropriate, re-engineering the processes that support the management of research and consultancy projects.

The journey is just beginning. Building on what has been learned, it is understood that these findings are not conclusive, nor ends in themselves, but rather part of an ongoing study of client feedback and of the processes that support research and consultancy project management. Whether future results reinforce what has already been learned remains to be seen. In either case (confirmation or refutation), the ongoing study of process efficiency, effectiveness and client satisfaction gives rise to opportunities for improvement.

The survey discussed in this paper is an ongoing, cost-effective source of manager- and decision-maker-friendly information that enables us to better understand the service perceptions and expectations of our clients. Competitive advantage, or superior market positioning, is thus achieved through this identification of client values and demands.

References

Australian Vice-Chancellors' Committee (2004). Pursuing the Vision for 2020. Election 2004: The Next Challenges for Universities. Retrieved 7 July 2004, from http://www.avcc.edu.au/

Buzzell, R.D. & Gale, B.T. (1987). The PIMS Principles: Linking Strategy to Performance. New York, NY: The Free Press.

Gronroos, C. (1990). Service Management and Marketing. Lexington, MA: Lexington Books.

Howat, G., Milne, I. & Crilley, G. (1996). 'Monitoring customer service problems and their resolution for leisure services'. New Zealand Recreation Association Annual Conference Proceedings, Palmerston North, New Zealand, November 1996.

Johnson, P.L. (1993). ISO9000--Meeting the New International Standards. New York, NY: McGraw-Hill Inc.

Joint Technical Committee QR-008. (2000). AS/NZS ISO9004:2000 Quality Management Systems--Guidelines for Performance Improvements. Sydney, Australia: Standards Australia International Ltd and Standards New Zealand.

Mentzer, J.T., Flint, D.J. & Kent, J.L. (1999). 'Developing a logistics service quality scale'. Journal of Business Logistics, Vol. 20, No. 1, pp. 9-32.

Parasuraman, A., Zeithaml, V.A., & Berry, L.L. (1985). 'A Conceptual Model of Service Quality and its Implications for Future Research'. Journal of Marketing, Vol. 49, pp. 41-50.

Parasuraman, A., Zeithaml, V.A., & Berry, L.L. (1988). 'SERVQUAL: a multiple-item scale for measuring consumer perceptions of service quality'. Journal of Retailing, Vol. 64, pp. 12-40.

Six Sigma Library. ISO9000--The Benefits of ISO Certification. Retrieved 12 March, from http://www.isixsigma.com/library/content/c000917b.asp

Tetenbaum, T. (1998). 'Shifting Paradigms: from Newton to Chaos'. Organisational Dynamics. Spring, pp. 21-32.

Turban, E., McLean, E. & Wetherbe, J. (2004). Information Technology for Management: Transforming Organisations in the Digital Economy (4th Ed). New York, NY: John Wiley & Sons Inc. ISBN 0-471-22967-9.

University of South Australia (2005). Research at UniSA. Retrieved 18 June, from http://www.unisa.edu.au/research/researchatuni/default.asp

Mark Gorringe: Masters in Administrative Management (in progress)

Mark Hochman, PhD

University of South Australia

Research and Innovation Services

Mawson Lakes Boulevard

Mawson Lakes, South Australia, 5095

Tel: 08 83025143/Fax: 08 83023921

Email: mark.gorringe@unisa.edu.au

mark.hochman@unisa.edu.au

Table 2.1 Importance and Performance Ratings

CSQ Attribute                                        Importance   Performance   CSQ Gap
Employee knowledge and experience                        5.3          4.9        -0.4
Key contact person clearly identified                    5.4          5.2        -0.2
UniSA's working knowledge of industry requirements       5.2          4.9        -0.3
Assuring trust and confidentiality                       5.3          5.1        -0.2
Flexible approach                                        5.2          4.8        -0.4

Figure 1.2 Overall Satisfaction with UniSA's Research Services

Year   UniSA   ATN Group
2004   6.00    5.75
2003   5.91    --
2002   5.65    --

Maximum rating = 7. The scale used for this question ranged from 1 'very dissatisfied' to 7 'very satisfied'.

Note: Table made from bar graph.

Figure 1.3 Positive Recommendation Levels of UniSA's Research Services

                             2004   2003   2002
UniSA recommend               53%    52%    50%
UniSA strongly recommend      36%    37%    38%
UniSA total recommendation    89%    89%    88%
ATN Group                     82%    --     --

'Undecided', introduced in the scale in 2002 (11%), was chosen by 10 per cent of respondents in 2003 and 11 per cent in 2004.

Note: Table made from bar graph.