
Sources of evidence to support systematic reviews in librarianship (EC).

INTRODUCTION

Systematic reviews provide answers to focused clinical questions through a rigorous and comprehensive methodology designed to limit bias [1]. The search for evidence to answer these questions should therefore be as thorough as resources permit [2]. As in other fields, systematic reviews of library and information science topics can answer questions in the field and inform best practices. This paper reports on the productivity of sources of evidence for such reviews and determines which are most efficient, alone and in combination.

METHODS

Three consecutive and recently completed systematic reviews on issues of information retrieval provided an opportunity to retrospectively analyze the sources of relevant evidence:

* The Checking Reference Lists (CRL) review [3] examined research into the utility of checking reference lists as a method to identify studies for systematic reviews.

* The Updating Systematic Reviews (Updating) project identified and summarized existing methods and strategies for updating as a first step in an ongoing research initiative [4].

* The Peer Review of Electronic Search Strategies (PRESS) review [5, 6] analyzed common errors in search strategies and proposed safeguards.

In the original 3 reviews, reviewers read 14,727 bibliographic records retrieved by the searches conducted to support the reviews and, where needed, the full-text articles, assessing them against each review's eligibility criteria. This process yielded 142 relevant documents included in at least 1 of the 3 reviews.

In the current study, 11 databases were examined for coverage of these 142 eligible studies: 3 MEDLINE search interfaces (Ovid MEDLINE; Ovid HealthSTAR, a version of HealthSTAR with coverage to the present [7]; and PubMed); EMBASE; Library, Information Science and Technology Abstracts (LISTA); Library and Information Science Abstracts (LISA); Cochrane Methodology Register (CMR); CINAHL; PsycINFO; Cochrane Database of Methodology Reviews (CDMR) (later absorbed into the Cochrane Database of Systematic Reviews); and Health and Psychosocial Instruments (HAPI). The databases in which the records were originally found had been recorded at the time of the search for each systematic review. The selected databases were then searched post hoc for each of the 142 eligible studies to determine where the included items were indexed.

Except where noted, eligible records served as the denominator for calculations of recall (proportion of relevant studies retrieved) and the numerator for calculations of precision (proportion of retrieved studies that are relevant) [8]. Bibliometric characteristics such as distribution of citations among journals were calculated using Reference Manager databases of the saved citations. Based on scope of coverage, journals were classified as library science or informatics, medical librarianship or medical informatics, or medicine (including evidence-based health care and epidemiology).
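
Expressed as formulas (the set notation here is ours, added for clarity, following the standard definitions in Salton and McGill [8]), with R the set of relevant (eligible) documents and S the set retrieved by a search:

```latex
% R = relevant (eligible) documents, S = documents retrieved by a search
\[
\text{recall} = \frac{|R \cap S|}{|R|},
\qquad
\text{precision} = \frac{|R \cap S|}{|S|}
\]
```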

RESULTS

Electronic bibliographic database searches were the means of identification for 101 of 142 (71%) relevant documents in the original reviews. The rest were identified by methods such as reference list scanning and peer nomination. The most common identifying sources for materials used in the original reviews were MEDLINE (28%) and LISA (21%).

Although 71% of the overall pool of relevant material was originally identified through bibliographic databases, 92% (131 of the 142 documents) were actually indexed in at least 1 of the tested bibliographic databases. Using the number of documents actually indexed in bibliographic databases as the denominator, rather than the total number of relevant documents, overall recall of the original searches was 77%.
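
As a worked check on these figures (our arithmetic, using the numbers reported above): 101 relevant documents were identified through database searching and 131 were indexed in at least 1 tested database, so

```latex
\[
\text{recall} = \frac{101}{131} \approx 0.77 = 77\%
\]
```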

Precision of the 3 original searches was low. With 142 documents found to be relevant, the overall precision of the original searches was 0.9% (0.5%, 1.2%, and 0.6% for CRL, PRESS, and Updating, respectively).
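
A rough reconstruction (ours; the source does not report per-review retrieval counts): treating the 14,727 screened records as the retrieved set and all 142 relevant documents as hits gives

```latex
\[
\text{precision} \approx \frac{142}{14{,}727} \approx 0.96\%
\]
```

which is of the same order as the reported overall figure of 0.9%; the small difference presumably reflects the per-review denominators not given here.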

Coverage

The MEDLINE search interfaces (Ovid MEDLINE, Ovid HealthSTAR, and PubMed) provided the highest coverage of relevant documents, indexing almost half of the relevant material (Table 1 online). Relative coverage of relevant material was equivalent across the 3 interfaces. LISA also covered almost half of the relevant material. CMR, LISTA, and EMBASE followed closely, each indexing over one-third of relevant documents; however, CMR had the largest unique component--documents not available from any other database tested (13 documents, 9% of the total). CINAHL covered roughly one-quarter of the relevant literature, while PsycINFO, CDMR, and HAPI provided little or no coverage. About 70% of articles found in any other single source were also indexed in the MEDLINE interfaces.

The relatively low unique contribution of various databases can be better understood by examining the overlap, or degree of redundancy, between databases [9]. Overall, the highest overlap was between LISA and LISTA, the 2 information science databases (Table 2 online). All relevant material indexed by LISA was also indexed by LISTA, and 91% of relevant material indexed in LISTA could be found in LISA. The greatest overlap seen among biomedical databases was between EMBASE and the MEDLINE interfaces: 69% of relevant material indexed by MEDLINE interfaces was also indexed in EMBASE, while 94% of relevant material indexed in EMBASE was also found in the MEDLINE interfaces. CMR, the database with the largest unique contribution, had moderate overlap with the biomedical databases but little overlap with the information science sources.
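
A minimal sketch of the overlap calculation behind Table 2, assuming each database's relevant holdings are modeled as a set of document identifiers (the database names are real, but the sets below are illustrative placeholders, not the study's data):

```python
# Sketch of the pairwise overlap calculation (Table 2, online).
# Each database is modeled as the set of relevant-document identifiers
# it indexes; these sets are illustrative placeholders only.
indexed = {
    "MEDLINE": {1, 2, 3, 4, 5, 6, 7},
    "EMBASE": {1, 2, 3, 4, 5},
    "LISA": {4, 5, 6, 8},
}

def overlap(a: set, b: set) -> float:
    """Proportion of documents indexed in a that are also indexed in b."""
    return len(a & b) / len(a) if a else 0.0

# Note the asymmetry: overlap(A, B) != overlap(B, A) in general,
# matching the 69% versus 94% MEDLINE/EMBASE figures in the text.
for row_name, row_set in indexed.items():
    for col_name, col_set in indexed.items():
        print(f"{row_name:>8} also in {col_name:<8}: "
              f"{overlap(row_set, col_set):.2f}")
```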

Resource combinations were examined using the 131 articles indexed in at least 1 of the studied databases as the denominator for calculating coverage (Table 3). Maximum coverage possible by searching 3 databases was 97%, achieved through the combination of 1 MEDLINE interface, LISA, and CMR. Maximum coverage possible through searching 2 databases was 87%, achieved through the combination of a MEDLINE interface and LISA.
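
The combination analysis can be sketched the same way (again with illustrative placeholder sets, not the study's data): take the union of each combination's holdings and divide by the number of documents indexed anywhere.

```python
from itertools import combinations

# Sketch of the combination-coverage calculation (Table 3).
indexed = {
    "MEDLINE": {1, 2, 3, 4, 5, 6, 7},
    "LISA": {4, 5, 6, 8, 9},
    "CMR": {7, 9, 10},
}

# Denominator: documents indexed in at least 1 tested database
# (131 articles in the study).
all_indexed = set().union(*indexed.values())

def coverage(combo) -> float:
    """Proportion of all indexed documents covered by a combination."""
    covered = set().union(*(indexed[name] for name in combo))
    return len(covered) / len(all_indexed)

# Exhaustively rank 2-database combinations by coverage.
for combo in sorted(combinations(indexed, 2), key=coverage, reverse=True):
    print(combo, f"{coverage(combo):.0%}")
```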

Precision

Of the 2 standard performance indicators for information retrieval, recall and precision, recall is of greater concern to systematic reviewers, as complete identification of relevant studies is thought to protect against bias [10]. Given that the MEDLINE interfaces had equal recall, precision or budget limitations may become the deciding factor. The precision of the MEDLINE searches used in the 3 reviews was compared when run in Ovid MEDLINE and Ovid HealthSTAR. The HealthSTAR retrieval was smaller in all cases. Overall precision was 0.10% for MEDLINE and 0.11% for HealthSTAR. This is a small absolute difference, but it translated into a 13% decrease in screening burden, avoiding 879 irrelevant records across the 3 reviews.
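
A back-of-envelope consistency check (our reconstruction from the reported values, not figures given in the source): if the 879 avoided records represent a 13% reduction, the Ovid MEDLINE retrieval across the 3 reviews was on the order of

```latex
\[
T \approx \frac{879}{0.13} \approx 6{,}760 \text{ records},
\qquad
T - 879 \approx 5{,}880 \text{ records in HealthSTAR}
\]
```

With the same relevant yield in both interfaces, the smaller denominator raises precision by a factor of about 6,760/5,880, roughly 1.15, consistent after rounding with the reported 0.10% versus 0.11%.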

Document type

Journal articles were the most common type of document retrieved by the searches (n=112, 79%), followed by conference abstracts (n=15, 10%). All journal articles were indexed in 1 or more of the examined databases, as was the single dissertation. However, only one-third of the electronic documents were included in the searched resources; the remaining two-thirds could not have been retrieved by the searches.

Bibliometric characteristics

The frequencies of authors and journals both followed standard bibliometric distributions [11, 12], with a few highly productive sources and the remaining material widely scattered. Three journals yielded 10 or more items and together accounted for almost a quarter of the material (23%) (Table 4 online). The sources fell in approximately equal numbers into 3 categories: library science or informatics; medical librarianship or medical informatics; and medicine, including evidence-based health care and epidemiology.
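
For reference, the standard formulations behind the cited distributions [11, 12] (not restated in the source) are Lotka's law for author productivity and Bradford's law for journal scatter:

```latex
% Lotka: the number of authors contributing n papers each falls off as
\[
f(n) \propto \frac{1}{n^{2}}
\]
% Bradford: successive journal zones of equal article yield contain
% journal counts in the ratio
\[
1 : k : k^{2}
\]
```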

DISCUSSION

In an article proposing a practical framework for evidence-based librarianship, Crumley and Koufogiannakis describe 6 domains, or categories of questions, based on the daily practice of librarians [13]. The topics of the systematic reviews studied here fit in the domain of "Information Access and Retrieval," but they were also interdisciplinary, as important evidence came from both health and library databases and journals. The library and information science journals found to be most productive overlap to some degree with those in Koufogiannakis et al.'s 2004 survey of librarianship research [14].

The best coverage of the evidence base for the systematic reviews in question was obtained through a combination of one MEDLINE interface, CMR, and LISA. The MEDLINE interfaces provided equivalent coverage of relevant material, so other factors will influence selection of databases for systematic reviews. Ovid MEDLINE is widely used by systematic reviewers [15], but the current analysis indicates that cost savings are possible by searching PubMed. An increase in precision with no loss of recall may be possible by selecting the HealthSTAR subset. When subscription access to databases is an issue, the combination of PubMed and LISTA, both available without cost, provided nearly as much coverage as Ovid MEDLINE and LISA, both of which have access fees.

Evidence for the 3 systematic reviews came not only from the journal literature but also from abstracts, books, and technical reports. Gray literature--specifically, conference abstracts, technical reports, electronic citations, and dissertations--made up 12% of the evidence base for these reviews, and another 8% came from books and book chapters. Alberani and Pietrangeli found that 22% of references in scientific publications in selected information science journals were to gray literature, although they noted that over half of these citations were to technical reports and tended to occur more frequently in journals focused on technical aspects of the field [16]. The current results correspond with their work when only conference reports and theses are considered.

Gray literature is not easily identified through database searching and must be sought by other means, such as research registries, library catalogs, web searching, citing reference searching, and personal communications [17]. The CMR was an important source for the 3 reviews, having the most unique coverage of any of the examined databases and, in particular, coverage of gray literature. Many of the CMR abstracts may eventually be published as full articles, while others represent pilot research that will likely remain gray literature, corresponding with Eldredge's observation that librarians have had few incentives to publish in the past [18]. Still, a similar database of research articles and abstracts from all areas of librarianship could be an important contribution to research capacity in librarianship by capturing a significant portion of the published and gray literature in one resource.

This work, like other such surveys of the literature, is based on a relatively small sample of reviews. However, the distribution of included studies conforms to findings of those previous surveys, increasing confidence in these results [19-21]. While specific findings may not generalize to other domains of librarianship, they reflect the sources contributing to one of the areas at the forefront of evidence-based librarianship.

CONCLUSIONS

This study of information sources for 3 systematic reviews demonstrates that the evidence base for information science can be multidisciplinary and, in this case, is drawn from the literature in health care, published literature in information science, and unpublished literature. The searching combination of 1 MEDLINE interface, LISA, and CMR provided the most comprehensive coverage, capturing 97% of the relevant literature indexed in any of the studied databases (89% of all documents included in the original 3 reviews). Freely available sources provided nearly equivalent coverage to subscription sources, removing one potential barrier to the successful execution of systematic research in this area. Access to the unpublished library conference literature could be an important enhancement to research capacity in librarianship. Library and information science researchers may use these findings to make informed, evidence-based, timely, and cost-effective decisions when selecting sources for systematic reviews.

DOI: 10.3163/1536-5050.96.1.63

(EC) Supplemental Tables 1, 2, and 4 are available with the online version of this journal.

Received July 2007; accepted September 2007

REFERENCES

[1.] Klassen TP, Jadad AR, Moher D. Guides for reading and interpreting systematic reviews: I. getting started. Arch Pediatr Adolesc Med 1998 Jul;152(7):700-4.

[2.] McGowan J, Sampson M. Systematic reviews need systematic searchers. J Med Libr Assoc 2005 Jan;93(1):74-80.

[3.] Armour T, Dingwall O, Sampson M. Contribution of checking reference lists to systematic reviews. Presented at: XIII Cochrane Colloquium; Melbourne, Australia; Oct 22, 2005.

[4.] Moher D, Tsertsvadze A, Tricco A, Eccles M, Grimshaw JM, Sampson M, Barrowman NJ. A systematic review identified few methods and strategies describing when and how to update systematic reviews. J Clin Epidemiol 2007;60:1095-104.

[5.] Sampson M, McGowan J. Evidence review: evaluating health technology assessment searches. Presented at: XIII Cochrane Colloquium; Melbourne, Australia; Oct 22, 2005.

[6.] Sampson M, McGowan J, Lefebvre C, Moher D, Grimshaw JM. PRESS: Peer Review of Electronic Search Strategies. CADTH Technical Report 2007. (Available at: <http://www.cadth.ca>. [cited 6 Nov 2007].)

[7.] Ovid HealthStar (HSTR) database guide [web document]. New York, NY: Ovid Technologies, 2007. [rev. 26 Feb 2007; cited 7 Aug 2007]. <http://gateway.tx.ovid.com/rel_live/server1/fldguide/hstrdb.htm>.

[8.] Salton G, McGill MJ. Introduction to modern information retrieval. New York, NY: McGraw-Hill Books, 1983.

[9.] Sampson M, McGowan J, Armour T, Cogo E. Managing database overlap in systematic reviews using Batch Citation Matcher: case studies using Scopus. J Med Libr Assoc 2006 Oct;94(4):461-3, E219.

[10.] Higgins JPT, Green S. Cochrane handbook for systematic reviews of interventions 4.2.6 (updated September 2006; section 5) [web document]. Chichester, UK: John Wiley and Sons, 2006. [rev. 1 Sep 2006; cited 28 Aug 2007]. <http://www.cochrane.org/resources/handbook/Handbook4.2.6Sep2006.pdf>.

[11.] Bradford's law. In: Black PE, ed. Dictionary of algorithms and data structures [web document]. Gaithersburg, MD: National Institute of Standards and Technology, 2004. [rev. 17 Dec 2004; cited 8 Aug 2007]. <http://www.nist.gov/dads/HTML/bradfordsLaw.html>.

[12.] Lotka's law. In: Black PE, ed. Dictionary of algorithms and data structures [web document]. Gaithersburg, MD: National Institute of Standards and Technology, 2004. [rev. 17 Dec 2004; cited 8 Aug 2007]. <http://www.nist.gov/dads/HTML/lotkaslaw.html>.

[13.] Crumley E, Koufogiannakis D. Developing evidence-based librarianship: practical steps for implementation. Health Info Libr J 2002 Jun;19(2):61-70.

[14.] Koufogiannakis D, Slater L, Crumley E. A content analysis of librarianship research. J Inf Sci 2004 Mar;30(3):227-39.

[15.] McGowan J, Sampson M, Santesso N. Collection development in support of Cochrane reviews: normative data on sources and database interface from 105 Cochrane reviews. Presented at: Upstate New York and Ontario Chapter of the Medical Library Association Annual Conference; Ottawa, ON, Canada; Oct 13, 2004.

[16.] Alberani V, Pietrangeli PD. Grey literature in information science: production, circulation and use. INSPEL 1995 Apr; 29(4):240-9.

[17.] Khan KS, Kunz R, Kleijnen J, Antes G. Systematic reviews to support evidence-based medicine: how to review and apply findings of healthcare research. London, UK: Royal Society of Medicine Press, 2003.

[18.] Eldredge JD. Evidence-based librarianship: searching for the needed EBL evidence. Med Ref Serv Q 2000 Fall;19(3):1-18.

[19.] Day D, Furlan A, Irvin E, Bombardier C. Comparing databases and search strategies for systematic reviews of musculoskeletal disorders. Presented at: XI Cochrane Colloquium; Barcelona, Spain; Oct 2003.

[20.] Avenell A, Handoll HHG, Grant AM. Lessons for search strategies from a systematic review, in The Cochrane Library, of nutritional supplementation trials in patients after hip fracture. Am J Clin Nutr 2001 Mar;73(3):505-10.

[21.] Kleijnen J, Knipschild P. The comprehensiveness of Medline and Embase computer searches. Searches for controlled trials of homeopathy, ascorbic acid for common cold and ginkgo biloba for cerebral insufficiency and intermittent claudication. Pharmaceutisch Weekblad Sci Ed 1992 Oct;14(5):316-20.

Margaret Sampson, MLIS; Raymond Daniel, BA; Elise Cogo, ND; Orvie Dingwall, MLIS

Margaret Sampson, MLIS (corresponding author), msampson@cheo.on.ca, Chalmers Research Group, Children's Hospital of Eastern Ontario Research Institute, Ottawa, ON, Canada, and Department of Information Studies, University of Wales, Aberystwyth, United Kingdom; Raymond Daniel, BA, rdaniel@cheo.on.ca, Chalmers Research Group, Children's Hospital of Eastern Ontario Research Institute, Ottawa, ON, Canada; Elise Cogo, ND, ecogo@cheo.on.ca, Chalmers Research Group, Children's Hospital of Eastern Ontario Research Institute, Ottawa, ON, Canada; Orvie Dingwall, MLIS, odingwall@cpsi-icsp.ca, Canadian Patient Safety Institute, Edmonton, AB, Canada
Table 1
Most productive databases based on indexing
of relevant documents (n=142)

                     Total        Unique
                   documents     documents

Source             n     %       n     %

MEDLINE            70    49     2 *     1
HealthSTAR         70    49       *     *
PubMed             70    49       *     *
LISA               68    48       6   4.2
CMR                54    38      13     9
LISTA              54    38       1   0.7
EMBASE             51    36       0    --
CINAHL             40    28       1   0.7
PsycINFO            7     5       2     1
CDMR                0    --      --    --
HAPI                0    --      --    --

* Coverage of included studies by the 3 MEDLINE interfaces
(MEDLINE, HealthSTAR, and PubMed) was identical. The 2 unique
documents were indexed in all 3 MEDLINE interfaces but in none
of the other databases searched.

Table 2
Overlap between bibliographic databases

Proportion of relevant documents indexed in each row database
that were also indexed in each column database

                      MEDLINE
                      interfaces  EMBASE  CINAHL   CMR    LISA   LISTA  PsycINFO

MEDLINE interfaces       1.00      0.69    0.44    0.56   0.34   0.26    0.06
EMBASE                   0.94      1.00    0.38    0.13   0.31   0.25    0.08
CINAHL                   0.73      0.60    1.00    0.28   0.65   0.50    0.00
CMR                      0.72      0.56    0.20    1.00   0.17   0.06    0.04
LISA                     0.74      0.25    0.38    0.13   1.00   1.00    0.03
LISTA                    0.70      0.26    0.37    0.06   0.91   1.00    0.02
PsycINFO                 0.71      0.29    0.00    0.29   0.29   0.14    1.00

Table 3
Database combinations

                                   % of       % of        Recall
                                 indexed    included       with
                                 articles   articles      actual
                            N    (n=131)    (n=142)     searches *

MEDLINE interface, LISA,
  CMR                      127     96.9       89.4     95/131, 72.5
MEDLINE interface, LISTA,
  CMR                      120     91.6       84.5          --
MEDLINE interface, LISA    114     87.0       80.3     70/131, 53.4
MEDLINE interface, LISTA   106     80.9       74.6          --
MEDLINE interface, CMR      85     64.9       59.9     57/131, 43.5

* Recall with the 2-database combinations may be slightly
underreported due to the order of duplicate removal. In Peer
Review of Electronic Search Strategies (PRESS), LISA records
were retained in preference to MEDLINE records, which were
retained in preference to CMR records. Thus, had only a
MEDLINE interface and CMR been searched, recall might have
been higher, as more National Library of Medicine records
would be present. In Checking Reference Lists (CRL), CMR records
were kept in preference to LISA records. The impact here is
likely to be small, as there was little overlap between those
records and CRL makes up a small percentage of the total.

Table 4
Most productive sources

 Number
   of                                                     Cumulative
articles                Source                 Category       %

   12      Journal of the Medical Library         ML          8.5
             Association (continues Bull
             Med Libr Assoc)
   11      Journal of the American Society        L          16.2
             for Information Science
   10      Cochrane Colloquium                    M          23.2
             (various years)
   5       Health Information &                   ML         26.8
             Libraries Journal
             (continues Health
             Libraries Review)
   5       Information Processing                 L          30.3
             & Management
   4       Journal of the American                ML         33.1
             Medical Informatics
             Association
   4       Online Review                          L          35.9
   3       British Medical Journal (BMJ)          M          38.0
   3       International Journal of               M          40.1
             Technology Assessment
             in Health Care
   3       Online                                 L          42.3
   2       British Journal of                     M          43.7
             Medical Psychology
   2       Computers &                            ML         45.1
             Biomedical Research
   2       Journal of the American                M          46.5
             Medical Association (JAMA)
   2       Journal of Clinical                    M          47.9
             Epidemiology

L=library science or informatics, ML=medical librarianship
or medical informatics, M=medicine including
evidence-based health care and epidemiology.