
Rejoinder: the practitioner-researcher divide revisited: strategic-level bridges and the roles of IWO psychologists.

'In the final analysis the progress of psychology, as of every other science, will be determined by the value and amount of its contributions to the advancement of the human race' (Witmer, 1907: p. 3, cited in Viteles, 1933).

Debates over relationships between science and practice have been around for a long time in applied psychology. Such constructive debate is, however, a sine qua non of a climate for fundamental advances in both research and practice in any science-based professional discipline. Debate forms a key bridge between academic scholars and practitioners. It stimulates mutual reflexivity; the sedimentary issues carried along on its flow are causes for professional introspection. In short, it is debate itself that is indicative of a narrowing of any gap between the scientific and practitioner wings in all knowledge-based disciplines.

The recent multiple-author contributions in the June 2006 issue of JOOP by Gelade, Wall, Symon, and Hodgkinson (all 2006) over the role of the journal in bridging any divide between researchers and practitioners in Industrial, Work and Organizational (IWO) psychology are therefore particularly to be welcomed. All authors make valid, constructive and noteworthy contributions to this debate, in this context with particular regard to the role of JOOP as a medium for information exchange and dissemination between researchers and practitioners in our field. This set of papers followed on from earlier debates in JOOP and other journals (e.g. Anderson, Herriot, & Hodgkinson, 2001; Arnold, 2004; Hodgkinson, Herriot, & Anderson, 2001). Since then, however, the debate has moved on substantially. It has received far wider attention, not just in the UK but internationally (e.g. Rynes, Bartunek, & Daft, 2001), and an even wider diversity of views has been expressed over the existence of such a divide (Rousseau, 2006; Van de Ven & Johnson, 2006), whether it is widening or narrowing over time (Hodgkinson, 2001; Tranfield & Starkey, 1998), whether any divide is purely negative or might hold positive facets also (Anderson, 2005), and how best IWO psychologists can build bridges between research and practice (e.g. Hyatt, Cropanzano, Finder, Levy, Ruddy, Vandeveer, & Walker, 1996). These issues clearly extend well beyond the bounds of the remit that JOOP can fulfil. Indeed, identical issues have emerged in other journals, whether the journal is primarily scientific in orientation or more of a practitioner newsletter.

In this short rejoinder and response to Gelade, Wall, Symon and Hodgkinson, I argue four points. First, that JOOP (and other scientific journals) can only serve its role satisfactorily if papers report sufficient methodological and analytical detail to allow for the scientific replication of reported findings, but that this should not in any way preclude consideration of practical implications. On this point, I therefore find myself paradoxically in some disagreement with Anderson (1998), a rather curious position both intellectually and psychologically. Second, I identify six prominent 'types' of research in IWO psychology--pure, fundamental, applied, action research, consultancy-generated research and critical theory musings. That a gap exists between research and practice is to be expected in any complex, diversified and specialized field. The gap itself is not the problem; rather, the notable problem is the lack of integrating processes, bridges for information exchange, and bidirectional policy formulation in both research and practice. Third, on the point of bridging mechanisms, I note the six such bridges identified by Hyatt et al. (1996), but identify seven other, more strategic-level bridges that we should be actively pursuing. These are government commissions and working parties, boards of directors and industry commissions, research council involvement, conferences and other fora, research consortia, editorial board memberships for practitioners and a consultancy-sponsored strategic fund. Fourth and finally, I return to the role of JOOP within this wider context. I argue that the journal is more than meeting its remit, and that while it is incumbent upon researchers to highlight ramifications for organizational practice, it is also the professional responsibility of practitioners to interpret and extrapolate from published findings.
On all four points, I argue that we need to reframe the debate away from the transfer of knowledge within the profession between psychologists, towards more externally focused issues of the wider strategic influence of both the science and practice of the field of IWO psychology.

Gratuitous complexity and the principle of scientific replication

At the heart of Gelade's (2006a) eloquent critique and his subsequent response (Gelade, 2006b) to the three replies are two fundamentally challenging claims: (i) that scientific papers in JOOP contain far too much detail over the methods and analytical procedures employed, and (ii) that, as a partial consequence, far too little space is devoted by authors to the practical, organizational policy implications which arise from empirical study findings. This combined point is in fact commensurate with Anderson's (1998) concept of 'gratuitous complexity' in research in the wider organization sciences. Back then I argued that excessive conformity to the conventions of 'pedantic science' (as we subsequently labelled it) was leading to a 'trend towards construct and analytical complexity for the sake of esoteric exclusivity in mimicking the jargon and quantitative methods in the physical sciences' (p. 325). Phew! In translation to plain English: researchers were having to be too clever by half for their own good, and this was leading inevitably to the marginalization of research away from practice and vital policy formation. In truth, I expressed this back then rather more directly: 'Managers regarding the premier academic journals in the field as being unreadable, banal and inconsequential' (p. 325).

I could not have been more mistaken. Organization science research is, by its very nature, complex, multifarious and conceptually challenging, and therefore in need of sufficiently complex methodologies and analytical strategies to generate meaningful 'pragmatic science'. These are not mere points of detail that can be jettisoned to a footnote (or an electronic addendum). Quite the opposite, in fact. A founding pillar of scientific investigation is that published theories, models and empirical findings are capable of replication or falsification (Feyerabend, 1980; Kuhn, 1970; Pfeffer, 1993). Studies are therefore required to report their methods, analysis procedures and findings in sufficient detail to allow other researchers to run identical, or replication-extension, studies (Sackett & Larson, 1990). Indeed, such 'details' are absolute prerequisites and are enshrined in the guidelines for authors and publication manuals of all top-tier journals in the organization sciences (e.g. APA Publication Manual, 2001). This we might term the 'Principle of Scientific Replication' (and this is not gratuitous complexity). It is an indispensable cornerstone for the future progress of research and a feature that distinguishes IWO psychology from the spurious, ungrounded trends and fads so apparent in other areas of organizational consultancy and human resources management practice. So is the publication of measures and scales in any quantified discipline, a point that Garry Gelade controversially argues should not be the role of JOOP. I cannot agree. In fact, this is a service to research and it is a vital role of any scientific journal, although disappointingly it has become less prevalent in several journals over recent years. As a crucial contribution to the 'tools of the trade' for research, most scholars with a quantitative bent would view this function of any journal as being highly valuable, even if scale measures are simply reported as part of a wider empirical study.

A final point of retort here. The reporting of methodology and analytical procedures does not preclude coverage in the discussion section of equally important issues concerning implications for practice. Reviewing the past few years of issues of the journal, I was struck by the fact that the overwhelming majority of papers published cover practical implications in quite some detail (far more than in some other 'academic' journals). The single paper that Gelade vilifies as practitioner-unfriendly in this respect is Patterson, Warr, and West (2004). Yet my rereading of this paper, as a scientist who also independently advises organizations and consultancies on best practice, led to the opposite conclusion. I felt that the authors had noted several higher-level issues for (potential) practice, not an easy task in a longitudinal study into causal links between organizational climate and productivity. The authors were rightfully guarded over their claims for immediate and micro-level practical implications given the sheer breadth of their study. Then there is also the question of the onus of professional responsibility: is it up to individual researchers to attempt to spell out in detail the myriad of possible implications for practice in a huge and unknown variety of settings? I would suggest not; indeed, it would be most unwise to do so. I would, conversely, suggest that organizations employing consultants would rightfully expect these expert advisors to be willing and able to extrapolate from a literature search of all published studies relevant to the particular assignment in question.

Bridging the divide: Embracing diversity and specialization

It seems axiomatic that we should be genuinely concerned over the widening of any gap between science and practice, yet I have argued most recently that it is not actually the width of the gap that we should be overly preoccupied with, but rather, the lack of sufficient bridging mechanisms to span the two sides of any chasm (Anderson, 2005). Indeed, I argue that the relationship between research and practice should itself constitute a new 'process domain' for investigation and that a 'natural distance' quite reasonably exists between the two. Before being committed to trial for professional heresy, the corollary to the latter point is that strong bridging mechanisms and bidirectional fora for exchange need to be maintained if our discipline is to flourish. Symon (2006) criticizes our earlier paper for not devoting space to defining 'practical relevance', an issue we have in point of fact discussed at some length elsewhere (Anderson, 1998, 2005; Hodgkinson et al., 2001); but my point here is categorical and simple: width does matter, but not nearly as much as bridging (a point made similarly by Wall, 2006).

Having disagreed with, and potentially alienated, colleagues who are scientists, practitioners or both, permit me to take this line of argument just a little further. An indicator of a healthy knowledge-based profession is that there is a gap between research and practice, at least at the extremes of 'blue-sky' research at the one extremity and routinized administrative practice at the other. Behavioural perspectives in modern organization science and consultancy practice, including IWO psychology, are too broadly dispersed to warrant anything else (Pfeffer, 1993; Rousseau, 2006). This gap is justified, historically defensible and necessary for the future success of our field (Anderson, 2005). I would argue that six major types of research are in evidence within IWO psychology, that each has resulted in publications in JOOP (see examples in parentheses below), and that each makes a different contribution to our discipline, as follows:

(1) Pure research. Theory building, blue-sky modelling, critiques of the dominant 'Zeitgeist', future-oriented reviews of the directions for further research (e.g. Allen & Hecht, 2004).

(2) Fundamental research. Meta-analyses of cause-effect relations, review articles, experimental studies into behavioural outcomes, cross-national studies of cause-effect relations or procedural outcomes (e.g. Bertua, Anderson, & Salgado, 2005).

(3) Applied research. Primary empirical studies, replication-extension studies in single or multiple host organizations (typically the preponderance of published papers in journals such as JOOP), theoretically based empirical studies (e.g. Patterson et al., 2004).

(4) Action research. Organization-based case studies, attitude survey and intervention projects that generate publishable findings (e.g. Fletcher, 1991).

(5) Consultancy-generated research. Serendipitous opportunities for data collection generated during within-organization consultancy projects, new product developments that with re-analysis may be publishable (e.g. Silvester, Anderson-Gough, Anderson, & Mohamed, 2002).

(6) Critical theory musings. Cross-paradigm critiques, post-modernist reformulations and calls for the abandonment of 'positivistic' methodology in IWO psychology (e.g. Dick & Nadin, 2006).

These categories are clearly not mutually exclusive, and the examples given are only illustrative of the types of research underway in our field. It is also sensible not to be prescriptive over the proportions of research within each of these six categories for our discipline to remain healthy both as a science and as knowledge-based practice. However, Pfeffer (1993) warned against the wholesale adoption of critical approaches, arguing persuasively that the growth of their popularity amongst (American) Academy of Management researchers had created a detrimental rift within the discipline and that this had regrettably undermined still further the perceived value of research findings amongst management practitioners (for a counter-perspective in the UK, see Symon & Cassell, 2006). It seems to me that the health of our discipline is far from assured and that we ignore Pfeffer's case for practically relevant science at our peril.

Bridging the divide: Bridging mechanisms and stakeholder groups

On the basis of a panel discussion held at the 1995 SIOP conference, Hyatt et al. (1996) compiled a list of the following six main bridging mechanisms available to IWO psychologists:

(1) Technology. Use professional society web pages, notice-boards and other electronic media.

(2) Invited addresses. Inviting practitioner speakers to give presentations in academia.

(3) Sabbaticals in industry. Academics taking their sabbaticals in commercial organizations.

(4) Practitioner involvement in graduate education. Beyond invited addresses, to use practitioners as regular instructors.

(5) Practicum projects. Internships, placements and periods of supervised work experience.

(6) Mutual research groups. Combined research projects involving academics and practitioners.

Many of these bridges are being widely used in European universities already, I would argue. But above and beyond this list, there are several perhaps even more important strategic-level bridging mechanisms that should be included and considered. I identify seven such strategic bridges as follows:

(7) Government commissions and working parties. IWO psychologists should set their sights considerably higher and more ambitiously than just influencing each other. Advisory roles on such governmental bodies also include commissions for professional practice, protection of the public good and anti-discrimination bodies where our knowledge is directly relevant but notably underrepresented presently.

(8) Boards of directors and industry commissions. Similarly, we should become more strategically involved in senior managerial decision-making and corporate governance. Applying our knowledge at this level gives genuine board-level influence, yet notably few IWO psychologists are included in such top-level fora.

(9) Research council involvement. IWO psychology competes against multiple other disciplines for funding and has often been underrepresented on central funding allocation committees over the years. Less funding begets less top-level research, fewer rising-star scientists, less influential research, external perceptions of being a quasi-science, and thus eventually a vicious circle of underfunding and underachievement.

(10) Conferences, CPD events and keynote addresses. Knowledge transfer in both directions often needs a face-to-face stimulant, and here our whole structural use of annual conferences, continuing professional development events, and keynotes by academics and practitioners should be reviewed for comprehensiveness, efficiency and value added.

(11) Research consortia. Establish industry-university research linkages in a far more formal and structured way. Multiple (non-competing) organizations can be included in research consortia led by a university research group, allowing far larger scale and impactful research projects to be undertaken.

(12) Editorial board memberships. Involve practitioners on the editorial boards of journals, or as we did when establishing the International Journal of Selection and Assessment, include a practitioner-researcher forum expressly for the fast track publication of action research or consultancy-generated studies. All papers can then be commented upon for how the implications for practice section have been handled.

(13) Consultancy-sponsored strategic fund. A small percentage levy on the profits of all consultancies in our field above a certain turnover, to go into a central development fund for strategic research, influencing government legislation, and disseminating scientific research findings for public consumption.

All of these suggestions are at a strategic level of influence and are purposely rather provocative in some instances, but they are also intended to highlight the case that we need to widen the focus of our message, not just to fellow psychologists but to far more important external stakeholder groups and governmental bodies. Efforts in this direction have been improving rapidly in several countries (most notably the UK, the Netherlands, Spain and South Africa), but my point is simply that our bridging mechanisms need to be thought of as externally driven, not merely as work psychology researchers communicating their findings to fellow psychologists working in consultancy settings. Precious little progress has been made to date upon exploiting the most strategic and high-level bridging mechanisms I propose above (especially government influence and psychologists serving on main boards of directors). Precious little progress has been made in establishing multiple 'process domain' interfaces between science and practice as topic areas worthy of research in their own right. Indeed, the psychology of the recipient of psychological knowledge (usually HR practitioners or line managers) has been notably under-researched. So, perhaps we should instead be asking how JOOP, as a journal representing what useful knowledge IWO psychologists can bring to the table in such strategic settings and contexts, can serve as a bridging medium in each of these arenas.

A final plea is warranted at this point. If the width of the gap is good, then the strength of our bridging mechanisms is absolutely vital and will be determined by two points in particular. First, these processes need to be bidirectional, that is, robust research influencing professional practice in the one direction, and simultaneously, cutting-edge practice driving and stimulating innovative research in the other. Elsewhere, I give examples from the history of our discipline of where these channels have been either functional (e.g. robust research informing professional practice, trends in practice generating novel research) or dysfunctional (e.g. robust research failing to influence practice: Anderson, 2005). Second, such a span of activities brings the need for empathy, understanding and a mutual appreciation of the value that practitioners and researchers alike bring to the discipline. We need to reflect critically upon the ways in which excellent practice can influence cutting-edge research, and vice versa. At the same time, as we noted in our original 2001 paper in JOOP, we need to be mindful of the drift that has occurred in recent years towards early career specialization into either practice or research in IWO psychology, and of the declining opportunities throughout a career to move back and forth in either direction. We are quite some distance, it seems, from the optimum position where all these bridging mechanisms work as well as they could and information is routinely exchanged in both directions between scholars and practitioners.

The role of JOOP: Everything to everybody?

The essential argument advanced by Gelade was that JOOP is failing to meet expectations, especially practitioner expectations, over the reporting standards and consideration of practical implications in papers accepted for publication. This might lead, it is argued, to a widening of the divide as practitioners begin to ignore the journal or indeed regard it as irrelevant from their perspective. As a single scientific journal in our field, is this truly a fair charge? Certainly, the wholesale incorporation of the gamut of modifications recommended by him would effectively reorientate the journal substantially and would undoubtedly result in a very different medium. It would stand in stark contrast to the stance advocated by Hodgkinson (2006), with whom I agree, that JOOP is pre-eminently a scientific journal, whose audience is drawn from organizational research circles, and should remain as such for the foreseeable future. As such, a far more pressing question is whether JOOP is doing all it could to help IWO psychology and psychologists become better ensconced in the strategic-level fora noted above.

Other newsletters and reviews of key research findings exist, it can be argued, which better meet the crucial need for practitioner information and summary findings from research, but there is a valid point here: how do we flag the process notes underlying research that are often so interesting and yet unavoidably become lost in the review and publication process? Much as a caption note can be added to an electronic document, I do agree with Garry Gelade that these often provide highly compelling and valuable insights to what lies behind the sanitized, published research report. Overall, however, I would argue that the journal is fulfilling its function, that it is widely read, that it has increasing citation indices, that it is influencing both research and practice and that it constitutes one of the most valuable bridges we have for information exchange between researchers and practitioners in IWO psychology. The debate should really be whether JOOP is fulfilling its role with regard to the seven strategic-level bridges I identified earlier and how the journal can be made more effective in meeting these higher order goals.

After agreeing in general with Gelade, Wall, Symon and Hodgkinson, but disagreeing in general with myself and with each of them on specific points, let me leave the final, telling word to Neal Schmitt (personal correspondence, September 2004). Upon sending him a copy of my chapter titled 'Relationships between practice and research in personnel selection: Does the left hand know what the right is doing?' he replied by e-mail with the telling simplicity of a single word: 'No'! Who said academics made things gratuitously complex?

Received 31 October 2006; revised version received 7 February 2007


Allen, N. J., & Hecht, T. (2004). The romance of teams: Toward an understanding of its psychological underpinnings and implications. Journal of Occupational and Organizational Psychology, 77, 439-461.

American Psychological Association (2001). Publication manual of the American Psychological Association (5th ed.). Washington, DC: Author.

Anderson, N. (1998). The people make the paradigm. Journal of Organizational Behavior, 19, 323-328.

Anderson, N. (2005). Relationships between practice and research in personnel selection: Does the left hand know what the right is doing? In A. Evers, N. Anderson, & O. Voskuijl (Eds.), The Blackwell handbook of personnel selection (pp. 1-24). Oxford: Blackwell Publishing.

Anderson, N., Herriot, P., & Hodgkinson, G. P. (2001). The practitioner-researcher divide in Industrial, Work and Organizational (IWO) psychology: Where are we now, and where do we go from here? Journal of Occupational and Organizational Psychology, 74, 391-411.

Arnold, J. (2004). Editorial. Journal of Occupational and Organizational Psychology, 77, 1-10.

Bertua, C., Anderson, N. R., & Salgado, J. (2005). The predictive validity of cognitive ability tests: A UK meta-analysis. Journal of Occupational and Organizational Psychology, 78, 387-409.

Dick, P., & Nadin, S. (2006). Reproducing gender inequalities? A critique of realist assumptions underpinning personnel selection research and practice. Journal of Occupational and Organizational Psychology, 79, 481-498.

Feyerabend, P. (1980). Against method. London: Verso.

Fletcher, C. (1991). Candidates' reactions to assessment centres and their outcomes: A longitudinal study. Journal of Occupational Psychology, 64, 117-127.

Gelade, G. A. (2006a). But what does it mean in practice? The Journal of Occupational and Organizational Psychology from a practitioner perspective. Journal of Occupational and Organizational Psychology, 79, 153-160.

Gelade, G. A. (2006b). Response to commentaries: Wider and wider. Broadening the readership of the Journal of Occupational and Organizational Psychology. Journal of Occupational and Organizational Psychology, 79, 179-181.

Hodgkinson, G. P. (Ed.). (2001). Facing the future: The nature and purpose of management research reassessed. British Journal of Management, 12(special issue), 1-80.

Hodgkinson, G. P. (2006). Commentary: The role of JOOP (and other scientific journals) in bridging the practitioner-researcher divide in industrial, work and organizational (IWO) psychology. Journal of Occupational and Organizational Psychology, 79, 173-178.

Hodgkinson, G. P., Herriot, P., & Anderson, N. (2001). Re-aligning stakeholders in management research: Lessons from industrial, work and organizational psychology. British Journal of Management, 12(special issue), 41-48.

Hyatt, D., Cropanzano, R., Finder, L. A., Levy, P., Ruddy, T. M., Vandeveer, V., & Walker, S. (1996). Bridging the gap between academics and practice: Suggestions from the field. Industrial-Organizational Psychologist, 35, 29-32.

Kuhn, T. S. (1970). The structure of scientific revolutions (2nd ed.). Chicago: University of Chicago Press.

Patterson, M., Warr, P., & West, M. (2004). Organizational climate and company productivity: The role of employee affect and employee level. Journal of Occupational and Organizational Psychology, 77, 193-216.

Pfeffer, J. (1993). Barriers to advancement of organizational science: Paradigm development as a dependent variable. Academy of Management Review, 18, 599-620.

Rousseau, D. M. (2006). Is there such a thing as evidence-based management? Academy of Management Review, 31, 256-269.

Rynes, S. L., Bartunek, J. M., & Daft, R. L. (2001). Across the great divide: Knowledge creation and transfer between practitioners and academics. Academy of Management Journal, 44, 340-355.

Sackett, P. R., & Larson, J. R. (1990). Research strategies and tactics in industrial and organizational psychology. In M. D. Dunnette & L. M. Hough (Eds.), Handbook of industrial and organizational psychology (2nd ed., Vol. 1). Palo Alto, CA: Consulting Psychologists Press.

Schmitt, N. (2004). Personal correspondence, September.

Silvester, J., Anderson-Gough, F. M., Anderson, N., & Mohamed, A. R. (2002). Locus of control, attributions, and impression management in the selection interview. Journal of Occupational and Organizational Psychology, 75, 59-76.

Symon, G. (2006). Commentary: Academics, practitioners and the Journal of Occupational and Organizational Psychology: Reflecting on the issues. Journal of Occupational and Organizational Psychology, 79, 167-171.

Symon, G., & Cassell, C. (2006). Beyond positivism and statistics: Neglected approaches to understanding the experience of work. Journal of Occupational and Organizational Psychology, 79, 307-314.

Tranfield, D., & Starkey, K. (1998). The nature, social organization and promotion of management research: Towards policy. British Journal of Management, 9, 341-353.

Van de Ven, A. H., & Johnson, P. E., (2006). Knowledge for theory and practice. Academy of Management Review, 31, 802-821.

Viteles, M. S. (1933). Industrial psychology. London: Jonathan Cape.

Wall, T. (2006). Commentary: Is JOOP of only academic interest? Journal of Occupational and Organizational Psychology, 79, 161-165.

Neil Anderson *

University of Amsterdam Business School, The Netherlands

* Correspondence should be addressed to Neil Anderson, University of Amsterdam Business School, Roetersstraat 11, 1018 WB Amsterdam, The Netherlands (e-mail:
COPYRIGHT 2007 British Psychological Society

Author:Anderson, Neil
Publication:Journal of Occupational and Organizational Psychology
Date:Jun 1, 2007
