
Determining how libraries and librarians help.


THIS ARTICLE EXAMINES THE QUESTION, "What differences do libraries and librarians make?" primarily from the perspective of geographical communities. The article first states the reasons why this is an essential research question and describes the contributions of current public library planning tools to the determination of impact. It then takes a broad look at the framework that is essential for the intellectual development of this topic and the ability to answer the question, including the methodological approaches and theoretical frameworks discussed throughout. While the authors pose this research problem as an evaluation question, this article examines the contributions that research in several areas--particularly research on professional practice, especially reference research informed by qualitative methods--can make to its solution. Finally, the authors examine approaches to studying context as a framework for determining the impacts of library services and include a brief presentation of findings from a recent study, "How Libraries and Librarians Help: Context-Centered Methods for Evaluating Public Library Efforts at Bridging the Digital Divide and Building Community," funded by the Institute of Museum and Library Services (IMLS). (1)


Periodically, the field becomes aroused because libraries have been overlooked in a landmark study of societal institutions, ignored in a major government report, or omitted from important legislation that could improve libraries' capacity to contribute to the solution to a societal problem. Why, professionals ask themselves, could the library have been ignored in this major study of X or this major federal initiative involving Y? In an essay entitled "Where are Libraries in Bowling Alone?" Jean Preer, like many before her, bemoaned the fact that "libraries are notably absent" from the consciousness of a major researcher or decision-maker. In this case the work was Robert Putnam's "compelling and widely-heralded work" on social capital (Preer, 2001, p. 60; Putnam, 1995; Putnam, 2000). Throughout her short article Preer asserts (to the readers of American Libraries) that libraries do, indeed, foster social capital, and that Putnam has ignored their contributions. She argues that for more than a century public libraries have worked to create an informed citizenry and to build community. Preer concludes that libraries contribute to most of the conditions that Putnam predicts will create "a more engaged civic and community life" including stimulating the civic engagement of young people and fostering tolerance, arts and cultural activities, and activities that inform citizens (Preer, 2001, p. 62).

Documenting the number of times the kinds of concerns raised by Dr. Preer have been voiced would fill many more pages than are allotted for this entire issue. At one point Preer quotes 1934 ALA President Gratia A. Countryman's response to the absence of libraries in a major 1930s study of American life: "What have we done or not done that this can be so? Why is it that we have not impressed ourselves, as an important and essential institution, upon the governing body or upon intelligent authors and scholars? Is it in the very nature of our work that it should be so, or is it in ourselves?" (Preer, 2001, p. 62). Since that time libraries have been absent from scores of major studies of societal issues, major legislation designed to solve societal problems, and the funding priorities of a number of foundations.

Preer's frustration, "That Putnam could miss the connection is a distressing reminder of the way in which libraries are simultaneously ignored and taken for granted," reflects the frustration expressed by generations of librarians and researchers (Preer, 2001, p. 62). Putnam, of course, is only one of many influential individuals and organizations over the decades who have lacked the awareness of existing and potential impacts of library services necessary to assess libraries' contributions to the solutions to particular societal issues or problems. It is easy to replace "Putnam" with any number of major researchers, the federal government, the media, or local decision-makers. The sheer number of individuals and institutions who have failed over the decades to see the contributions of libraries to society should alert the field that the messages currently being sent do not convey the contributions that libraries and librarians make to their communities.

Authors in this issue were charged to identify significant and researchable questions, describe prior research that could prove useful, and suggest methodologies for future work. This article addresses the broad question, "What differences do libraries and librarians make in the lives of individuals, their families, neighborhoods, the community organizations that serve them, and the larger community?" It is essential to realize that this basic question has been elusive for a century. Librarians as a profession have been committed to excellence during this entire period, but have lacked the tools that could provide the answers. Efforts of librarians to quantify excellence for several decades were focused on standards, inputs, and more recently, outputs, none of which are capable of answering that question. In the last decade of the twentieth century two quite different external forces--(1) the radically changed environment in which libraries operate and (2) the pressure from external agencies for institutional accountability--brought this question to the attention of both librarians and researchers.


There is a major demand across the public sector for accountability that began, coincidentally, with the development of the Internet. This demand began at the federal level of government: "Fiscal conservatism, the devolution of responsibility to the states, and skepticism about social programs [are now driving both evaluation and] national policy making" (Rossi et al., 1999, p. 19). In a recent article these authors addressed this important question from the perspective of pressures that are forcing librarians to begin to seek out indicators and measures of outcome (Durrance and Fisher-Pettigrew, 2002). (2) In that article we discussed the convergence of factors within and outside of librarianship that has created an environment conducive to the development and use of indicators of impact of library services. Advances in evaluation research are certainly an important enabling factor. More importantly, however, demands for public-sector accountability and governmental activities aimed at determining service outcomes have driven the widespread need in the public sector (and among nonprofits) for identifying and adopting outcome measures.

Reflecting a loss of citizen confidence in the work of governmental agencies, the 1990s brought a convergence of thought among decision-makers that federal, state, and local governmental agencies, institutions, and nonprofit organizations must begin to reshape public services and products to focus more on accountability. During that period the U.S. federal government identified reinventing government as a priority and focused on developing approaches government agencies could use to demonstrate their accountability (Osborne & Gaebler, 1992). Two federal initiatives have guided these government mandates: the Government Performance and Results Act (GPRA) of 1993 and the Governmental Accounting Standards Board Concepts Statement #2 in 1994 (Institute of Museum and Library Services [IMLS], 2000; Multnomah County Auditor's Office, 2000). GPRA requires every government agency "to establish specific objective, quantifiable, and measurable performance goals for each of its programs. Each agency must annually report to Congress its level of achievement in reaching these goals" (Sheppard, 2000). "When GPRA is fully implemented, it will directly impact state and local governments that receive Federal funding by requiring them to report on program results" (Multnomah County Auditor's Office, 2000, p. 2). Thus, demand for public sector accountability is a key factor in the changing evaluation horizon across the public sector.

The federal agency most concerned with public library development and excellence, IMLS, poses the question, "What differences do libraries and museums make?" While the federal government, through the work of IMLS, demands that librarians develop measures of outcome that will indicate "benefits to people: specifically, achievements or changes in skill, knowledge, attitude, behavior, condition, or life status for program participants" (IMLS, 2001), the approaches most commonly used to evaluate libraries are still focused on the institution rather than its users. IMLS has warned that "if museums and libraries do not take the responsibility for developing their own set of credible indicators, they risk having someone else do it for them" (IMLS, 2000). These moves toward accountability bring the public sector into an era of mandated development of outcomes. Because there is now an urgency to articulate messages that resonate with those who influence public policy decisions, there has been a rush to develop ways to measure outcomes. It is essential that this work is informed by relevant research.


For well over a century, the public library, an American invention, has worked to make contributions to the lives of citizens of the community. The literature of that effort is quite extensive and beyond the scope of this article. We note, however, that this literature examines the broad-ranging roles that public libraries have undertaken in their communities (Molz & Dain, 1999; Van Slyck, 1995). The breadth of services undertaken by this institution led over time to the development of several generations of planning tools that have increased public library planning effectiveness and the development of effective mission statements, goals, and objectives (Palmour et al., 1980; McClure et al., 1987; Himmel & Wilson, 1998; Nelson, 2001).

These tools have fostered a new generation of mission statements that seek to distill the library's purposes and values while articulating the approaches used to fulfill them. Statements developed today often emphasize the needs the library seeks to meet, reflecting librarians' desire to show that the library serves a vital role in its community. Increasingly, public library mission statements are framed to indicate the value of the public library to the community from the perspective of its contributions to the lives of citizens. Mission statements show that libraries seek to:

* "promote the development of independent, self-confident, and literate citizens"; (3)

* "enhance the personal development" of citizens "by seeking to meet their informational needs, recognizing the benefits to the community of a well-informed citizenry, the individual's capacity for self-improvement, the worth of each person and the need for human dignity"; (4)

* "inform, enrich, and empower every individual in its community by creating and promoting free and easy access to a vast array of ideas and information and by supporting lifelong learning in a welcoming environment." (5)

These mission statements could lay the groundwork for developing more effective indicators of the impacts of public libraries in their communities and help shape the activities that lead to relevant community outcomes.

Unfortunately, the planning and assessment tools mentioned above fail to provide mechanisms to move public libraries to make the conceptual leap involved in developing outcomes based on these strong statements of commitment to the community. That is a big order, and the research that would support these actions has been slow to materialize. Therefore, these tools still focus evaluation efforts on public library output measures. These measures, in use for nearly twenty years in one form or another, were designed to move public libraries beyond the time-honored, but limiting, measure of circulation. Developers added other measures of use, including annual library visits, in-library materials use, turnover rates, program attendance, and reference questions. These measures, first introduced in the 1980s, all include a calculation to determine per capita usage, and provide tested approaches to collect and analyze data on a variety of indicators of library use (Nelson et al., 2000; Van House et al., 1987). The output measures began as a well-intentioned move away from heavy reliance by public libraries on input measures mandated by public library standards. Public librarians, state agencies, and the federal government have come to rely on output measures for public libraries as indicators of public library effectiveness. While the primary values of these measures are as indicators of efficiency and use, they do not reflect value gained by the user. Yet, output data are being collected on a statewide basis by state library agencies and analyzed at state and federal levels.
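The per capita calculation underlying these output measures is simple arithmetic; the sketch below illustrates it with measure names drawn from the list above and wholly invented figures (the counts and service-area population are assumptions for demonstration only):

```python
# Hypothetical sketch of per capita output measures.
# Measure names follow the output measures named in the text;
# all figures are invented for illustration.

annual_counts = {
    "circulation": 412_000,
    "library_visits": 198_500,
    "reference_questions": 31_200,
    "program_attendance": 9_800,
}
service_area_population = 54_000  # legal service area, an assumed value

# Each output measure divides an annual count of use
# by the population of the library's legal service area.
per_capita = {
    name: round(count / service_area_population, 2)
    for name, count in annual_counts.items()
}

for name, value in per_capita.items():
    print(f"{name}: {value} per capita")
```

As the article notes, such figures describe efficiency and volume of use; nothing in the division captures what any user gained from a visit or a loan.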

Further, output measures, particularly those focusing on circulation and materials, have become the basis for additional, related measures, including the controversial Hennen's American Public Library Rating system (HAPLR). The HAPLR weighting system compounds the emphasis on circulation by factoring this element into the index at least six times (cost per circulation, collection turnover, circulation per FTE hour, circulation per capita, circulation per hour, circulation per visit). Hennen has used the HAPLR index to identify the "best" libraries in the nation (Hennen, 2002). The data suggest that high circulation coupled with low staffing costs appears to be the key to an effective library as the index defines one. Librarians who use HAPLR to evaluate their libraries are likely to focus their energies on the areas the index emphasizes. Because HAPLR has selected and featured the "top" libraries in the nation, it is not difficult to imagine that libraries seeking a higher ranking will add multiple copies of currently requested materials, especially videos and other materials whose circulation periods are short, to increase their scores. There is no doubt that some of the libraries at the top of the HAPLR list are among the best in the nation, but these institution-focused measures fail to determine the contributions of these libraries to their communities.
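The compounding effect described above can be made concrete. The sketch below (a hypothetical illustration, not Hennen's actual formula or weights) computes the six circulation-based ratios named in the text from a set of invented library statistics; because circulation appears in every one, a single change in circulation moves all six components at once:

```python
# Illustrative sketch (not Hennen's actual HAPLR formula or weights):
# the six index components named in the text that all incorporate
# circulation. All input figures are invented for demonstration.

stats = {
    "circulation": 412_000,
    "expenditures": 1_650_000,   # total operating cost, assumed
    "holdings": 160_000,
    "fte_staff_hours": 52_000,
    "population": 54_000,
    "hours_open": 3_100,
    "visits": 198_500,
}

circ = stats["circulation"]
circulation_ratios = {
    "cost_per_circulation": stats["expenditures"] / circ,
    "collection_turnover": circ / stats["holdings"],
    "circ_per_fte_hour": circ / stats["fte_staff_hours"],
    "circ_per_capita": circ / stats["population"],
    "circ_per_hour": circ / stats["hours_open"],
    "circ_per_visit": circ / stats["visits"],
}

# Raising circulation lowers cost per circulation and raises the other
# five ratios, improving the library's standing on every component at
# once -- the compounding emphasis the article questions as a proxy
# for community impact.
print(len(circulation_ratios))
```

The design point is simply that six of the index's inputs are transformations of one raw count, so the "best" library is largely the busiest circulator per dollar, hour, and visit.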

An attempt to overcome this weakness in the planning tools and to provide a bridge to outcome measures was undertaken by researchers at the Colorado State Library's Library Research Service (LRS). LRS worked with selected libraries that use the Public Library Association's Planning for Results guides by designing data collection instruments for several categories derived from the original thirteen Planning for Results service responses (Steffen et al., 2002; Steffen and Lance, 2002; Lance et al., 2002). LRS's Counting on Results (CoR) project worked with forty-five test public libraries to collect outcome data on six library service responses that had been modified by participating libraries and CoR researchers--Basic Literacy, Business and Career Information, Library as a Place (Commons), General Information, Information Literacy, and Local History & Genealogy. Researchers worked with librarians to identify candidate outcomes and then developed for participating libraries a standard oversized postcard survey form for each of the modified library service responses to determine the extent to which each outcome was present in each library. This approach resulted in the identification of a range of candidate outcomes that librarians conjectured might emerge from the chosen service responses.

The researchers indicate that more libraries (twenty-five) distributed and collected survey forms on General Information (GI) than any of the other service responses.
 [GI] outcomes were the most popular, including the highest
 percentage of respondents for a single outcome. Indeed the least
 popular GI outcome was more frequently reported than the least
 popular outcomes for other responses. These trends indicate that
 not only does this [service response] apply to the greatest number
 of libraries, it is also the most relevant to the largest number of
 library patrons. (Steffen et al., 2002a)

The most widely reported outcome--"read for pleasure"--however, fails to capture the essence of an outcome--in other words, "achievements or changes in skill, knowledge, attitude, behavior, condition, or life status for program participants" (IMLS, 2001). For this most popular service response, it appears that librarians and researchers identified a relatively weak set of candidate outcomes, in all probability because they failed to collect data resulting from specific GI encounters by library users.

On the other hand, the candidate outcomes suggested by librarians and Counting on Results researchers for more focused services, where librarians are more likely to understand their users, hold more promise. This is seen in the Business and Career Information service response, where some respondents agreed that they had "developed job-related skills," or in Basic Literacy, where some respondents reported that they had "became a citizen," "prepared for the naturalization exam," or "helped a child do homework or improve grades" (Steffen et al., 2002). CoR researchers, however, were concerned that (1) some survey questions (such as "became a citizen") may have been misunderstood, and (2) the more focused surveys yielded very few responses to most of the project's selected outcomes. The data also suggest that the methods used to collect outcome data need to be designed to capture the context of a specific service model. Contextual outcomes will be discussed later in this article.

This early set of candidate outcomes for public libraries brings both promise and concerns. The focused service responses offer the most promise. However, they may not measure the full impact of public library services since they were not generated through user-focused research. Rather, they were identified first by librarians and then tested with a broad range of users. Thus, if librarians underestimate the impact of their services and then test these guesstimates, the measures they choose will not reflect the full impact of their services.


Research on the reference interview, discussed below, has made strong contributions to our understanding of the impacts of library service. This research--which arose to answer one question: how accurately do librarians answer questions?--evolved over time to focus on theoretical approaches to the nature of the interaction and can be thought of as a model for examining the emerging research that will answer the question, "What differences do libraries and librarians make?" Along the way, researchers not only identified a range of negative outcomes of poorly constructed reference interviews, but they also showed that the integration of research findings into professional practice resulted in improved outcomes.

Gains from Reference Research

Research on the reference interview in the past several decades has been transformed from a topic once considered far too difficult to be amenable to effective research into a synergistic body of knowledge that can elucidate the context of seeking information from a mediator or system. The small but representative sample of reference research discussed here--particularly findings shaped by the effective use of qualitative approaches--shows that in LIS, just as in other fields, what is considered a researchable question has been built on the questions raised and partially answered by a succession of researchers. The most effective work in translating these research-based knowledge gains into practice--and ultimately providing a framework that reference librarians can use to help people solve their information problems more effectively--has been done by Catherine Ross and Patricia Dewdney and, more recently, by their colleague Kirsti Nilsen (Ross & Dewdney, 1998; Ross et al., 2002).

While it took some time for synergistic outcomes to appear, it is clear now that the knowledge gains made in this area have helped us to begin to answer the question, "how do libraries and librarians help?" Starting with Robert Taylor in the late 1960s, researchers began to realize that reference, long thought to be an art that was difficult to transmit to novices, was a potentially rich research problem (Taylor, 1968). Much of the research discussed below made use, at least in part, of qualitative methodologies.

Early Research Questions

A number of researchers, most recently Ross, Nilsen, and Dewdney, have traced the considerable research knowledge gains in the thirty-year period that began with the pioneering work of Terry Crowley and Tom Childers (Ross et al., 2002; Radford, 1999). The early research that spawned such a rich body of knowledge sought to measure the effectiveness of reference by determining accuracy rates using questions developed by the researchers. The answers to that early research question (how accurately do reference librarians answer questions?) raised even more interesting research questions (such as, is this the right question for the researcher to ask?) that were amenable to qualitative approaches. The unobtrusive approaches used by Crowley and Childers and replicated repeatedly by scores of other researchers in the 1970s and 1980s showed that librarians consistently failed to accurately answer factual questions about half the time (Hernon & McClure, 1986). Very importantly, however, by the early 1970s researchers had learned that one of the major ways that librarians interact with people, the reference encounter, was a very researchable problem.

The Process of Building on Previous Research

The early work of Crowley and Childers sparked the interest of other researchers such as Lynch (1978), whose own work continued the synergistic knowledge gains. Additional gains in knowledge about how librarians help (and hinder) emerged from the research of Dewdney (1986). Both Lynch and Dewdney determined that, when actual interviews were recorded in their natural setting using unobtrusive approaches, the research problem was actually fairly complex. Many questioners in libraries phrased questions from a system perspective: "Do you have any books on...?" People often failed to state their information needs in the initial question. This research, using real interviews, also showed that far too often--approximately half the time--the staff member answered the question directly rather than negotiating it, with the result that the questioner failed to get a satisfactory response from the librarian (Ross et al., 2002, p. 8). Research studies during the 1980s and 1990s identified specific approaches used by librarians that hinder rather than help those who seek information. Researchers also learned in that period that accuracy, while a noble goal of practice, was not the single or perhaps even the best measure of reference effectiveness because it focused on the question rather than the questioner.

Durrance's research, using unobtrusive approaches and questions formulated by observers, proposed and tested a new indicator--willingness to return to the staff member in the reference interview--against a variety of interpersonal and search variables and found that interpersonal variables are key to the success of the interaction (Durrance, 1989, 1995). This research and that of Dervin and Dewdney (1986), Dervin and Clark (1987), Dyson (1992), Dewdney and Ross (1994), Ross and Dewdney (1998), and Ross et al. (2002) show how particular communication approaches and behaviors (the use of open questions, follow-up questions, attention to closure, etc.) boost the effectiveness of the reference interaction. Dyson and her colleagues showed that librarians could be taught to improve the reference experience for questioners by identifying and overcoming common failures (Dyson, 1992).

Job and Career Centers--Community Information Reference Services

The recession of the late 1980s and early 1990s brought new, community-focused, need-based services to public libraries that built on knowledge gains made by reference researchers, especially work that had been conducted in public libraries. These community-focused services, including Job Information Centers (JICs), also helped librarians understand the information needs of job seekers, including blue-collar workers who had lost the jobs they had held for decades, displaced homemakers, and professionals unable to find work in a declining economy. Several years ago Durrance identified a rich set of strategies used by the staff of a number of job and career information centers (Durrance, 1991a, 1991b, 1993, 1994). Staff noticed that many of those who used job centers were not typical library users and did not understand the library as an information center. They saw people who were desperate to get information about the job market and how they fit into it. Staff in these centers began to sort out the variety of needs that people who are unemployed or underemployed bring to a trusted community resource (in this case, the library). They used a variety of approaches, including computer software, to help people assess their skills and options. Staff were well connected in the community and collaborated with other agencies, which facilitated appropriate referrals to other community organizations. Staff expanded their array of resources by providing a broad range of computer, video, and print resources on jobs and careers. They began to provide specialized reference services, including answering an array of questions that built on each individual's situation. They also honed the interviewing skills of staff and provided access to advising and career counseling sessions by appointment. Focusing on the needs of their clientele, job center staff developed workshops on specific needs such as resume writing, interviewing, and starting a business (Durrance, 1993, 1994). As a result, staff in these libraries realized--through the numerous testimonials they and their administrators received--that they were making a difference in the lives of their clientele, although at that time no tools existed to help them systematically document their contributions to the community. Durrance (1994) developed preliminary evaluation approaches to help bridge this gap.

Contributions of Theoretical Frameworks to Reference Research

Application of theoretical frameworks during the 1980s and 1990s further enriched researchers' ability to more effectively focus on the questioner. Dervin's theory of sense-making has been used by researchers to show that the best responses to queries are those that help users solve the problem behind the question (Dervin & Dewdney, 1986). This theoretical framework has led to the development of more effective approaches to the reference interview through the use of sense-making questions. (For a summary see Ross et al., 2002, pp. 93-101.) It appears that successful outcomes for users have increased because professionals have learned how to employ these approaches effectively (Ross et al., 2002, p. 98).

The application of the theory of mental models, while not as widely used as sense-making, has the potential for making strong contributions to knowledge growth and improved professional practice. Cognitive scientist Donald Norman and others developed the theory of mental models to better understand the major discrepancies between users and the developers of systems. "In interacting with the environment, with others, and with the artifacts of technology, people form internal mental models of themselves and of the things with which they are interacting. These models provide predictive and explanatory power for understanding the interaction"; further, "[p]eople's mental models are apt to be deficient in a number of ways, perhaps including contradictory, erroneous, and unnecessary concepts...." In short, they are "messy, sloppy, incomplete, and indistinct" (Norman, 1993, pp. 7, 14).

Gillian Michell and Patricia Dewdney, using this theoretical framework, show that it can successfully elucidate the intractable problem first identified by Lynch (1978) and Dewdney (1986) of poorly formed user queries coupled with a tendency among many librarians to take these ill-formed questions at face value (Dewdney & Michell, 1996; Michell & Dewdney, 1998; Michell & Dewdney, 2002). This phenomenon--drawn from linguistics and called "ill-formed query" by Dewdney and Michell--is applied to "a question that doesn't work because it leads to erroneous inferences" (Ross et al., 2002, p. 22). Their research shows that ill-formed questions often lead to reference interaction failure. The Michell-Dewdney Mental Models Study compares the mental models of questioners with those of librarians by observing actual reference interviews and then interviewing both the user and the librarian. This research examined the following questions: "Does the librarian's understanding of the system (including the collection and its organization, the physical layout, her own role in that system, and the characteristics, values and beliefs of the user) differ in any important way from the user's understanding of that system and its role with respect to the situation from which the information need arose, the user's beliefs and attitudes towards libraries as places to solve problems, and the uses to which the user plans to put the information? If there is an important difference, does either the librarian or the user discover it, and how does that discovery affect the outcome of the transaction?" (Michell and Dewdney, 2002). The theoretical framework and the methods used to collect the data have allowed these researchers to show how the user's mental model of the transaction (and to some extent of the library system) differs from that of the librarian. 
This theoretically based research further helps researchers and practitioners understand the important discrepancies in the mental models of librarians and questioners.

For several decades theory has shaped the research focused on information behavior. The section below discusses, in particular, research that has begun to focus specifically on understanding social contexts.


Marcia Bates, one of the field's most distinguished LIS researchers, has identified the three key questions associated with LIS research (Bates, 1999). They are:

1. The physical question: What are the features and laws of the recorded-information universe?

2. The social question: How do people relate to, seek, and use information?

3. The design question: How can access to recorded information be made most rapid and effective?

Bates' second question drives the work of numerous researchers across the world who study information behavior that is surely related to the question of "What differences do libraries and librarians make?" Yet in the late 1990s, Jorge Schement warned that librarians "lag in [their] understanding of the evolving social context--a context in which libraries will have to justify themselves," and suggested that libraries consider "how Americans [will] live their lives as citizens, as economic actors, and as social beings" in the coming decades (Benton Foundation, 1997, p. vi). The research framework discussed below will make increasing contributions to practice as librarians move to determine the impacts of their professional contributions and those of their institutions.

Recent Research that Informs Context

In a recent ARIST review, Pettigrew, Fidel, and Bruce (2001) synthesize recent advances and conceptual growth in the field increasingly known as information behavior research--defining this research as "the study of how people need, seek, give and use information in different contexts, including the workplace and everyday living" (Pettigrew et al., 2001). These authors show the role of theory in shaping research on information behavior, providing examples of information research that has "focused on the user as an individual, cognitive being and on the behaviors associated with information processing" (Pettigrew et al., 2001). They remind us of the rich knowledge gains made by information behavior researchers informed by the theoretical work of Dervin, Kuhlthau, and others. This most recent literature review shows a greater focus by researchers on context.

Seeing the need for a better understanding of contextual factors, researchers looked to a new vehicle for sharing context-focused information behavior research: the international conference Information Seeking in Context (ISIC). An increasing number of researchers have begun to shape our understanding of context since the first ISIC conference in 1996 (Vakkari et al., 1997). This emerging body of context-focused research should make strong contributions to the question, "What differences do libraries and librarians make?" Carol Kuhlthau warns that "[to neglect context] is to ignore the basic motivations and impetus that drives the user in the information seeking process" (Pettigrew, 1999, p. 802).


Knowledge Gains Resulting from Qualitative Methods

Throughout this article, qualitative research methods and approaches receive particular attention because the authors assume that this framework provides the researcher with a variety of tools for understanding the complex interactions that shape phenomena of study, including the impacts of libraries and librarians on society.

Qualitative research, as defined by Creswell, is "an inquiry process of understanding based on distinct methodological traditions of inquiry that explore a social or human problem. The researcher builds a complex, holistic picture, analyzes words, reports detailed views of informants, and conducts the study in a natural setting" (Creswell, 1998, p. 15). "Accordingly," Denzin and Lincoln (2000) add, "qualitative researchers deploy a wide range of interconnected interpretive practices, hoping always to get a better understanding of the subject matter at hand" (p. 3). Qualitative approaches give the researcher the flexibility to look closely in order to describe and explain. These frameworks, especially when informed by theory, bring a user perspective to agency evaluation. Qualitative approaches can "illuminate aspects of libraries, library services, and library users' perspectives in ways we have not had access to in previous research" (Lincoln, 2002).

LIS has benefited over the past two decades from work done by researchers using qualitative approaches. For example, starting in the 1980s, Carol Kuhlthau's extensive work on the information search process has used theoretically grounded qualitative approaches to give the field not only a framework for understanding a range of cognitive and affective states associated with the search process (factors that strongly influence the outcomes of any search), but also an understanding of the various--and very different--stages of the search process. Kuhlthau's research has shown that these now well-known stages--initiation, selection, exploration, formulation, collection, and presentation--can be understood both by those who experience them and by information professionals who can, by understanding them, develop appropriate intervention strategies (Kuhlthau, 1991, 1993, 1994, 2001). A longitudinal study of her initial group of informants indicated the positive impact on the seeker of understanding the search process (Kuhlthau, 1999).

Pioneered by Brenda Dervin in the 1970s, sense-making studies employing qualitative methods have been conducted for decades (Dervin et al., 1976). This work has made strong contributions to information behavior research; it can also be seen as contributing to an understanding of the impact of library services. In a project funded by the State Library of California, Dervin and Clark (1987) identified a range of user-identified "helps" (outcomes) associated with public library services. Dervin's categories of "helps," framed from the perspective of the general library user, included: got ideas/understandings about something; accomplished something; decided what to do or when or how to do it; got rest and relaxation and a quiet retreat; got motivated to do something; felt good about myself, my decision, my circumstances; calmed down and eased my worries; felt like I belonged and was not alone; got pleasure, entertainment, and happiness. The purpose of Dervin and Clark's overall study, which was well ahead of its time, was to bring sense-making approaches to librarians so that they might collect use data "in human terms" (Dervin & Clark, 1987, p. 1). These methods laid the groundwork necessary to determine the outcomes implied in the research question examined in this article. Most information behavior researchers who use qualitative approaches also enrich this research through theory application and development.

Evaluation Methodologies

Determining the differences libraries make is most often framed as an evaluation problem. Evaluation is generally seen as the assessment of various aspects of programs, including: "(a) the need for the program, (b) the design of the program, (c) the program implementation and service delivery, (d) the program impact or outcomes, and (e) program efficiency" (Rossi et al., 1999, p. 33). Evaluation as a social science came of age in the 1970s (Rossi et al., 1999, p. 11). As the field has matured, evaluators have increasingly employed qualitative approaches to evaluation questions. Indeed, 20 percent of the authoritative Handbook of Qualitative Research is devoted to an examination of "The Art and Practices of Interpretation, Evaluation, and Representation" (Denzin & Lincoln, 2000, pp. 870-1065).

In fact that bastion of the scientific method, the National Science Foundation, has funded evaluation studies employing qualitative methods. For example, the NSF-funded study conducted by Mark et al. (1997) effectively used qualitative approaches to determine the benefits of community technology centers. The authors reported that using a community technology center brought, in aggregate, a variety of benefits, including work-related benefits such as improved job skills, improved computer skills, access to employment opportunities; educational benefits including an improved outlook on learning new skills and knowledge; a variety of personal efficacy and affective outcomes, including general life improvements, confidence-building, a changed outlook on life and future prospects, feelings of accomplishment and hope, and changes in the use of time and resources; increased civic participation and changes in social and community connections; and increased technological literacy (i.e., improved perceptions of technology as a means to achieve individual goals). Research conducted by the authors of this article and discussed in the final section of this paper shows similar gains in community-focused public library services.

While librarians engage in evaluation, their most common focus is on efficiency measurement (as seen in output measures). For nearly two decades, all types of libraries have collected performance or output data. A more recent trend, brought about by the governmental pressure discussed above, has been the call for more accountability--in other words, for answers to the question, "What difference does this agency make in terms of those who use it or depend on its services?" This call for accountability, which determines the value of a program based on those who should benefit from it, requires incorporating consumers into the formula. As a result, evaluation research has made greater use of qualitative methods. Rossi et al. (1999) note that "incorporation of the consumer perspective into evaluation research has moved the field ... into the policy arena" (p. 13).

Increasingly, researchers within LIS have determined that the time is ripe for concerted efforts at developing appropriate evaluation research. In a monograph that resulted from a recent ASIS session on evaluation, Cliff Lynch wrote, "The answers we can supply today aren't good enough. We cannot currently measure outcomes and effects systematically with much success" (McClure & Bertot, 2001, p. 320). Lynch argues that evaluation questions are amenable to the kind of intellectual effort that goes into studying societal "grand challenge" problems. He suggests that the "time is ripe for grand challenge problems in information science and networked information, particularly in areas related to evaluation, given the importance of the public policy choices we face today involving IT and the growing emphasis on accountability of our institutions" (McClure & Bertot, p. 314). In sum, recognition by librarians, funding agencies, and researchers has created a climate for the kind of research that can help libraries more effectively articulate their contributions to society. It is premature to predict the outcomes of such research, but there is no question that it will change the way that librarians think about their practice and, as a result, will change the practice itself. As evaluators have known for decades, people do not evaluate what they do; they do what they evaluate.

Philip Doty, examining evaluation issues from a policy perspective, urges rethinking of current approaches to evaluation (in McClure & Bertot, 2001). Doty sees "the birth of a richer and more complex policy analysis--one that is more catholic in its methods, more self-conscious, more sensitive to narrative and values, more ethnographically sophisticated, and more aware of the limitations of all its methodological resources" (p. 230). Doty proposes that researchers put their research emphasis on "(1) the user of networked technologies grounded in a social setting; (2) the naturalistic investigation of technologies' situated uses, meanings, and related practices; and (3) the achievement of democratic, participatory design and social relations" (McClure & Bertot, 2001, p. 247).

Carol Hert seeks "to provide a connection between user-centered evaluation processes and system design" (Hert, 2001, p. 165). She draws both on the theoretical approaches of information seeking and use and those of human-computer interaction to develop a framework for the development of metrics for user-centered evaluation. She recommends that evaluators develop metrics derived from theoretical conceptualizations, undertake constructivist approaches, "educate the design community about the potential of various kinds of user studies" (p. 168), and develop approaches to "transform results into design decisions" (Hert, 2001, p. 160). Saracevic (2000) developed a conceptual framework for evaluation that identifies five distinct areas that are the subject of evaluation of digital libraries; these include societal, individual, and institutional factors, as well as the interface and, of course, the content. Unruh et al. (2000) have introduced a framework for the evaluation of digital community information systems.

The work of Peter Hernon and his colleagues on service quality is a strong addition to LIS's knowledge of evaluation (Hernon & Dugan, 2002; Hernon & Nitecki, 2001; Hernon & Altman, 1996; Hernon & Altman, 1998). Peter Hernon and Ellen Altman's (1998) customer-centered approaches are designed to move librarians beyond what they call the "countables" (input and output measures). This study builds on previous work by LIS evaluation researchers and provides an extensive overview of service quality with a focus on the customer, a carefully chosen term. With some urgency, Hernon and his coauthors consider the importance of understanding and developing the customer base at a time of rapid change and discuss a variety of approaches to measuring service quality: "some academic administrators, members of city government, and others question the role of, and even the need for, a library; after all, they assume everything--or everything worth knowing--is, or will be, available on the information superhighway" (Hernon & Altman, 1998, p. 211).

Digital library researchers have begun to examine the social aspects of the design, use, and impact of information systems (Kling, 1997, 1999, 2000; Bishop et al., in press). Bishop and her colleagues argue for the inclusion of participatory action research in the study of the design, use, and impact evaluation of digital information systems (Bishop et al., 2000). Participatory action research demands relevant outcomes for marginalized members of society. It seeks to enhance the problem-solving capacities of local community members by actively involving them in every phase of research--from setting the problem to deciding how project outcomes will be assessed. In this approach, the intended users of a digital library participate as researchers, not subjects. Bishop et al. (2000) use scenarios developed by the target audience in the design and evaluation of services. They found that "scenarios empower potential users as initiators in the analysis of information about their expectations and requirements, rather than treating them as mere informants in the design process" (Bishop et al., 2000). They note that scenarios are needed to develop "a more complete picture of the social context of information-seeking and technology use for those marginalized groups who are often on the fringes of system design and evaluation" (Bishop et al., 2000).

In short, evaluation in LIS is in a state of creative turmoil as researchers realize that current approaches and tools fail to reflect the changes brought about by the digital revolution. Evaluation issues are beginning to be addressed by researchers at meetings initiated by federal agencies such as IMLS, by major associations (including a focused midyear meeting organized in 1999 by ASIS that spawned a monograph on the topic), and at recent interdisciplinary meetings, including the several workshops on evaluation of digital libraries developed by the European-based DELOS Network of Excellence on Digital Libraries (McClure & Bertot, 2001). (6) The field is closer now than ever to harnessing the energies of a critical mass of researchers interested in new approaches to evaluation that will incorporate the radically changed library environment.

When asked in 1999 how effective their current evaluation tools were in providing data on the benefits of their community information services, librarians resoundingly said that current tools were grossly inadequate (Durrance & Fisher-Pettigrew, 2002, p. 47). Evaluation approaches should provide tools that librarians in community settings can use to determine the effects of specific services, because outcomes, while interesting in the aggregate to researchers and decision-makers, are most valuable to librarians as indicators of their contributions. In addition, evaluation can provide the tools that enable librarians to shape services based on a better understanding of the impacts of present service models and activities. The research below, informed by the use of contextual approaches, shows how librarians can identify a rich group of indicators of impact.


Contextual approaches have provided information behavior researchers with much richer ways to understand people's use of information. These approaches can also provide both researchers and librarians with a means to develop a rich set of outcomes. Data for this section were drawn from findings of a recently completed research study entitled "How Libraries and Librarians Help: Context-Centered Methods for Evaluating Public Library Efforts at Bridging the Digital Divide and Building Community." The study was funded by IMLS, and the research was conducted by a team of researchers from the University of Michigan and the University of Washington. Researchers applied contextual approaches to this important evaluation question. This research, using qualitative approaches, empirically examined the use of specific community-focused services to develop context-sensitive approaches and instruments that identify outcomes. Services included those designed for immigrant populations, after-school community technology programs for teens, community networks, information and referral services, programs designed around ethnicity, and consumer health information services. Together these case studies (1) contribute to the growing knowledge base that shows how library services affect lives, and (2) have resulted in the field's first set of contextual tools designed to identify outcomes of public library services. (7)

The "How Libraries and Librarians Help" study was built on the large body of information behavior research (cf. Pettigrew et al., 2001; Wilson, 1997; Dervin, 1992), and on research on people's use of everyday information (e.g., Harris & Dewdney, 1994; Savolainen, 1995). The contextual frame, drawn from the past research of the principal investigators and the frameworks of others, incorporates factors associated with the clientele as well as library-centered factors and those associated with staff (Pettigrew, 1999; Durrance, 1993; Durrance, 1994; Durrance & Pettigrew, 2001; Pettigrew et al., 1999). Research that we have conducted has been discussed extensively in other articles (Durrance & Pettigrew, 2000, 2001, 2002; Durrance & Fisher-Pettigrew, 2002; Pettigrew, Durrance, & Unruh, in press). The specific framework employed varied among the sites, but in general incorporated the following factors:

The clientele of the specific service. The individuals who participated in this study of community-focused services differed considerably. They were the study's primary informants. Researchers spoke to individuals and representatives of organizations who used or could use a particular service. Interviews focused on their needs and their experiences. Teens in the community technology programs came to gain technology skills and left with considerably more than that. Often, however, they indicated that they needed to overcome negative perceptions of librarians before they could reap the benefits of the programs they participated in. Community agency staff and community nonprofits were almost worshipful of library staff who had over the years helped them better understand and participate in the community as information providers. All shared a concern that information was difficult for them to get and use. Immigrants in the study often spoke no English at all and required the assistance of staff who spoke their native language or another language they understood. Because of these language barriers, most of the interviews with this population were conducted by library staff in the interviewees' own languages rather than by project researchers.

The library and its service model. This research focused on a range of problem areas undertaken by public libraries: the problems faced by immigrant populations, the need to help bridge the digital divide for teens in poor communities, the need to meet community information needs, the need for multicultural opportunities, people's need for health information, and building electronic community. All had in common a community-focused model. However, each model is specific to the needs identified in the community. Data were collected by examining materials developed by the library, interviewing administrators and staff, and extrapolating model components from interviews with users of the model.

The set of activities designed to respond to the clientele. This research identified a varied set of activities that reflected a rich knowledge of the chosen primary clientele. Although the manifestations were different in each service, each of these community-focused services provided a warm, welcoming environment that fostered the activities associated with the service. Activities ranged from providing what users perceived as a safe place to a variety of proactive approaches to increasing access to information.

Staff contributions. Each of these programs was headed by visionary staff who shaped the model, recruited the clientele, and developed the activities that shaped the outcomes of the community-focused service. Staff shared these characteristics: they were committed to their clientele, creative in their approach to providing service, entrepreneurial in seeking additional resources, and able to articulate some, but not all, of the outcomes of their services. Some were recruited to their jobs because of special skills that they brought to the service, such as language facility, interest in the clientele, ability to teach, or knowledge of information technology.

The section below presents descriptions of three types of library programs in four libraries (the second program example examines two different approaches to presenting after-school community technology programs for children and teens). For all four case studies, we present the setting, the program, and what we consider "candidate outcomes" that have emerged from examining the contextual factors identified above. (8) At present, these candidate outcomes await further honing and testing by the study libraries. They are framed from the perspective of the users of the service.

In each case study, the italicized terms in the discussions of candidate outcomes represent major outcome categories. The case studies from which these data were drawn include, as well, a range of indicators of impact consisting of anecdotal data and specific comments from users that reflect the outcome. Headings are taken from site-specific codebooks developed in the course of analyzing the qualitative data collected as part of the investigation.
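The codebook-driven analysis described above can be sketched in miniature. The code below is a hypothetical illustration only, not the study's actual instrument: every category name, code label, and excerpt is invented. It shows the mechanical step a research team might script when moving from coded transcript excerpts to counts of major outcome categories.

```python
from collections import Counter

# Hypothetical site-specific codebook: outcome category -> code labels.
# (All names invented for illustration; real codebooks are developed
# inductively while analyzing the qualitative data themselves.)
CODEBOOK = {
    "skills and abilities": {"tech_skills", "communication_skills"},
    "perceptions and attitudes": {"trust_in_staff", "confidence"},
    "changes in behavior": {"civic_engagement", "sharing_with_family"},
}

def tally_outcomes(coded_excerpts):
    """Count coded interview excerpts per major outcome category."""
    # Invert the codebook so each code label maps to its category.
    label_to_category = {
        label: category
        for category, labels in CODEBOOK.items()
        for label in labels
    }
    counts = Counter()
    for _excerpt, label in coded_excerpts:
        if label in label_to_category:
            counts[label_to_category[label]] += 1
    return counts

# Invented sample of (excerpt, code) pairs from a transcript.
sample = [
    ("I learned to build web pages", "tech_skills"),
    ("Now I actually talk to the librarians", "trust_in_staff"),
    ("I showed my mom how to e-mail", "sharing_with_family"),
    ("I can explain my work to strangers", "communication_skills"),
]

print(tally_outcomes(sample))
```

In practice such tallies would serve only as a summary view; the anecdotal data and user comments mentioned above remain the substantive indicators of impact.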

The contextual factors, as will be seen in the discussion below, result in services which have both similar and unique qualities. In addition, each service has a unique set of stakeholders (including the participants, interested agencies and organizations, and decision-makers within and outside of the library) who need to understand the impacts of the service. While the candidate outcomes presented here were identified by the study research team, they have not yet been tested. The next step in the outcome selection process will be for staff (and stakeholders) at each participating (case study) library to select and test the outcomes they seek to use from the candidate set.

Services to Immigrants by the Queens Borough (NY) Public Library

Queens. The 2000 census calculates the population of the Queens borough of New York City at 2.2 million, a 40 percent increase over 1990 statistics; 41.1 percent of the Queens population claim birth outside the United States, and, for the first time in the borough's history, more than half of Queens residents speak a language other than English. The 2000 census records a 50 percent increase in the Hispanic population, bringing the Hispanic community to account for a quarter of the borough population. In addition, African Americans make up 19 percent of Queens residents, while Asians constitute 17 percent. The borough also boasts the highest populations of a number of ethnic groups in the city, among them Asian Indian, Chinese, Korean, Filipino, Bangladeshi, Pakistani, and Colombian communities.

Queens Borough Public Library (QBPL) Service Model and Activities. Queens Borough Public Library (QBPL), a system with sixty-three branches and six Adult Learner Centers, serves the most ethnically diverse county in the United States. Queens customers represent over 120 countries and 160 nationalities and speak over 100 languages. To ensure that branch programs and services appropriately reflect local constituencies, QBPL employs a full-time demographer to analyze data from multiple sources. The demographer also produces color-coded maps of Queens' communities using Geographic Information System (GIS) software. Demographic analysis and visualization allow the library to take a current snapshot of the community, as well as to project future demographic shifts. The library's New Americans Program seeks to help immigrants transition into American life. It encompasses multilingual Web site management; multilingual, multicultural, and multimedia collection development; mail-a-book programs in six to seven languages; and two streams of public programming: cultural arts programs and coping skills workshops. Staged throughout the borough, the library's cultural arts programs celebrate a variety of cultures in multiple languages. Queens' coping skills workshops address topics in response to the needs of Queens' immigrant populations. The library hosts coping skills workshops in Spanish, Chinese, Korean, and Russian but, as warranted, extends this programming to other languages, including Haitian Creole, Polish, Hindi, and Bengali. The Adult Learner Program of the Queens Public Library, also designed to meet the needs of immigrants, serves over 6,000 students a year in tailored settings. In addition to its specialized curricula, the program supports small group classes, conversation groups, and technology-assisted instruction.
Its English for Speakers of Other Languages (ESOL) Program offers ninety-two classes in two terms per year in locations throughout the borough. Enrollment is on a first-come, first-served basis, and the classes are always oversubscribed.

Candidate Outcomes for Immigrants. The QBPL services demonstrate the range of outcomes that librarians can expect from library services designed to reach and to serve immigrant communities. Outcomes of the program, and their indicators of impact, are reflected as changes in skills and abilities, perceptions and attitudes, and changes in behavior. The following italicized indicators of outcome show how immigrants and their families benefit from the New Americans and Adult Learner Programs offered at QBPL. Outcomes originate with immigrants' discovery of the library and in their appreciation of its role as a safe and welcoming place through which to adapt to their new environment. Once in the library, immigrants begin to build information literacy skills as they learn what the library can do for them and how to exploit its resources. Their transition further advances as immigrants effectively interact with staff, interactions--often in the immigrants' native tongues--that support relationship-building and thus help to integrate the immigrant into the social fabric of the community. In turn, immigrants bridge cultural landscapes as the library allows them to maintain connections to their native culture, introduces them to foreign cultures, and links them to their new American culture and community. Once equipped with an appreciation of resources and of context, immigrants gain new skills and knowledge that allow them to become more independent as they seek to improve their lives and the lives of their families. In the process, immigrants develop a positive impression of the library and share news of their experience with family and friends, returning benefits to the library itself.

After-School Public Library Community Technology Programs in Austin, TX, and Flint, MI

Two case studies focused on community technology programs. The service model and clientele varied considerably; these variations influenced the outcomes experienced by participants.

Austin. The mission of the Austin Public Library is "to provide open access to information and to promote literacy, love of reading, and lifelong learning opportunities for all members of the community." Wired for Youth (WFY) is an after-school drop-in program that provides computers for youth at selected library branches in or near low-income areas. The goal of the program is to provide facilitated Internet and computer access to Austin youth, particularly those at risk. WFY is a nonstructured computer technology program for young teens and preteens based on computer self-use. The WFY computers, located in public spaces in branch libraries, are designated for youth use only. Computers are loaded with kid-friendly educational software, Internet sites, and computer games. They are available on a first-come, first-served basis, and use is generally limited to thirty minutes due to heavy demand.

WFY librarians teach basic technology skills, use technology as a tool to help students feel comfortable in the library, make the library a warm, inviting place, and provide a place for homework and access to tutoring assistance in most branches. Each librarian acts as a facilitator, a reference librarian, and an educator (primarily for one-on-one, as-needed instruction). WFY librarians help students configure e-mail accounts and enroll in virtual pen pal programs with kids in other countries, conduct selected training sessions, showcase student work, engage in a variety of trust-building activities, and help students complete small tasks with attainable goals on the computer. WFY staff "triage" children coming through the door after school, directing them to various activities and developing activities for students who are waiting for computers.

Flint. Flint, MI, is a rust-belt community that has experienced economic downturns in recent decades, including the exit of the city's major employer, General Motors. The city and school system struggle with scarce resources because of the declining tax base. The city is about 53 percent African American and 41 percent white. Community Information Agents Online (CIAO), an intensive after-school community technology program requiring five to six hours per week for the school year, sought to foster teen civic engagement by giving participants the skills they needed to help a community organization develop a Web presence. Thus, students needed both to increase their knowledge of the community and to develop a range of technology skills. To do this, participants were required to spend one afternoon a week and a Saturday morning engaged in active learning and site development.

By the end of the program (an academic year), teenagers had adopted an array of computing technologies to support their project work. Hardware, like digital cameras and scanners, and software, including word processors, graphics editors, browsers, and Web page editors, were among the tools the teenagers used each session. Students were expected to gain the skills needed to develop the content for a Web site by working with a community organization to interview staff and edit content based on staff input. The program focused on positive aspects of their community and encouraged students to learn more about their community and seek out community assets. Flint Public Library staff held periodic public celebrations designed to foster pride, self-confidence, and presentation skills of the participants as well as to have them exhibit their work. Students and staff invited parents, nonprofit organizations, local community leaders, and the local news media, including the local television station, to these events that were always accompanied by refreshments. Students had opportunities to present their work briefly to the entire group and demonstrate it at one of the computer stations in the lab.

Candidate Outcomes for Youth. This study of after-school community technology programs in Flint and Austin shows that such public library programs can have strong impacts on the young people who use them, impacts that go well beyond the technology skills the participants initially seek. Because the two models differ and a one-size-fits-all approach to determining program impact does not apply, outcomes at the two sites are similar but not identical: they vary in both kind and intensity within a broadly shared framework.

Austin Candidate Outcomes. WFY Centers have become a "safe place" for kids after school, and many stay until the library closes. Youth interviewed at the library told us how much they valued it as a safe, welcoming place where they could do homework, use and play on the computers, and collaborate with others. Youth reported that they had increased their technology skills. Participating in WFY has given Austin children the opportunity to increase their communication and self-expression skills and has fostered their ability to learn. Perception and attitude changes, such as increasing trust of library staff, are also important outcomes of this program; for kids who hold negative perceptions of the adults in their lives, such changes are necessary before they can trust an adult. WFY librarians noticed that over time the program helped build the confidence of some children and broadened their world-view. This research showed that some benefits extend beyond the participants to families and friends.

Flint Candidate Outcomes. The teens who participated in the community technology program in Flint gained an extensive range of technology skills, and with those skills came personal cachet and recognition. They likewise developed communication skills, including the ability to express themselves and to communicate more effectively with people they did not know well. Flint CIAO participants made a variety of learning gains: they became actively engaged in their own learning and gained knowledge of their community, and, building on these gains, it appears that some became more civically engaged. CIAO staff and participants both noted changes in participant perceptions and attitudes. Leaders noticed increasing youth trust in staff, and participants developed a sense of responsibility for their work and showed pride in their accomplishments. Participants and staff also noted changes in social behavior and the building of social capital, visible in new patterns of engagement, relationship-building, and expanding social networks. Participants valued the librarians' networks in the community; the fact that students were associated with the library opened doors to community organizations generally closed to teenagers. Finally, a group of family and community outcomes, including the sharing of knowledge gains with family members, teachers, and others, appears to extend beyond the teens to their families and neighborhoods.

Peninsula Library System's Community Information Program Information and Referral Service

The Community Information Program Model. The Peninsula Library System (PLS), headquartered in San Mateo, CA, is a consortium of thirty-four public and community college libraries that serve multiple communities in the area. Its mission states that PLS "strengthens local libraries through cooperation, enabling them to provide better service to their diverse communities." PLS's twenty-five-year-old Community Information Program (CIP) seeks to provide accurate and up-to-date information to social service agencies and library staff through its database and a variety of publications. The database contains over 3,000 detailed profiles and contact information for nonprofit and government agencies in the county that provide direct services to the public. CIP's primary clientele are the social service agencies that use the database or the many specialized publications and services, such as customized maps, that CIP staff develop. Relationships with this clientele have developed over time, and agencies indicate that CIP provides them both with community information and with the ability to disseminate information about their own activities to potential clients. CIP is staffed by a group of librarians who work for the Peninsula Library System but are housed with other county human service agencies, giving staff and clientele the benefits of proximity. CIP staff focus both on database development and on maintaining contact with their clientele, and they work collaboratively with many community organizations. Staff skills include public speaking and training abilities, and some staff have gained skills in special-purpose software such as geographic information systems (GIS). In addition to providing products and services directly related to the database, CIP has taken a leadership role as an information provider within the nonprofit community. CIP hosts regular meetings at which service providers meet and exchange ideas, as well as regular training sessions that orient nonprofit staff to community resources.

Candidate Outcomes for Community Organizations. A synergistic cycle of community outcomes appears to result from the carefully crafted strategies and activities devised by CIP, starting with the solid framework that rests on the CIP community information database. The reliable and up-to-date information provided by the CIP and the connections that the program makes between community organizations lead to larger outcomes. The research team identified six categories of impact on area human services organizations, starting with the most basic: increased knowledge of the community. This gain is the direct result of the variety of information products built on the main CIP database. Second, CIP staff foster shared information and increased communication; information-sharing and its corollary, increased communication among organizations, are fostered through a variety of CIP outreach mechanisms such as orientation sessions and bimonthly meetings. These and additional organization development activities, in turn, lead to the third group of outcomes, increased coordination and collaboration among the target organizations in the community. It is not surprising that the fourth and fifth categories, increased organizational capacity and the resulting improved delivery of services, show a synergy that builds on the more basic strategies and their outcomes. Finally, these outcomes appear to lead to a community-wide set of impacts; by employing a set of diverse strategies, CIP lays the foundation for a more effective community.


Contextual approaches based on qualitative studies, as we have seen above, produce rich outcomes. The outcomes discussed above represent only some of those identified through this IMLS-funded research. A theme that cut across the case studies showed that librarians act to bridge the digital divide and increase technological literacy and, in addition, facilitate a variety of personal efficacy and affective outcomes, including general life improvements, confidence-building, a changed outlook on life and future prospects, feelings of accomplishment and hope, and changes in the use of time and resources. In addition to the community-focused outcomes explored above, we have found that community networks bring similar empowering benefits for organizations that reach and serve a variety of audiences (Durrance & Pettigrew, 2001; Durrance & Fisher-Pettigrew, 2002). Other studies have shown that many libraries contribute job-related benefits to citizens (Durrance, 1993, 1994). The recent book Libraries and Democracy focuses extensively on the roles that libraries and librarians play in a civil society, including identifying examples that show how libraries contribute to increased civic participation and changes in social and community connections (Kranich, 2001). The Counting on Results study, based on librarian-suggested outcomes, identified some viable outcomes of public library service responses (Steffen et al., 2002; Steffen & Lance, 2002; Lance et al., 2002).

The initial work done in this area has only begun to identify outcomes that reflect the contributions of public libraries. Most of the impacts of public library services remain largely undocumented, and research that focuses on the differences that libraries and librarians make in their communities is at the stage that research on the reference interaction had reached by the early 1970s. At the beginning stages of research on reference, no one could have predicted how it would build on itself, resulting in the rich and varied contributions that research has made to understanding that seemingly simple interaction. Had some external factor frozen research on the reference interaction at that time, researchers and the profession would not have the rich knowledge base that has accumulated over the past thirty years. However, the pressures for immediate accountability discussed early in this article do exist; these pressures and the need to codify outcomes could limit the move toward further identification of impacts that will make sense to citizens, policy-makers, and social science researchers. Decision-makers must resist a rush to develop a comprehensive "set" of outcomes to be tested across libraries and instead focus on helping librarians more effectively identify and articulate both their own value and the contributions of the institution. This will also mean testing candidate outcomes that reflect the contextual factors of importance to specific services at the local level. Likewise, librarians must act now to understand the new evaluation environment and the value of determining the outcomes of their services. (9)

Researchers and librarians will need to work together to articulate the outcome patterns that occur across services and to assist in the important definition and conceptual development likely to occur as librarians' acceptance and use of this approach to evaluation grows. Academic librarians seeking to determine the impact of library services and information literacy approaches have already begun to move into this stage of development. Public library researchers and librarians should be prepared to move into a period of "develop[ing] definitions and concepts that support more effective communication and use" of outcomes such as described by Kyrillidou (2002). This period will be followed by more research aimed at identifying relevant outcomes that actually build on previous work; this will likely be followed, as Kyrillidou suggests, by definition tightening, testing, and honing data collection approaches. It is difficult at this stage to predict the trajectory of this research. If it is shaped by external frameworks that speak to decision-makers, government agencies, researchers, and citizens as discussed in the article's opening paragraphs, there is more chance the research will provide librarians with the tools they need to determine and articulate their contributions and those of libraries.


(1.) For more information about this research, see:

(2.) The coauthor of this paper has also published under the name of Karen E. Pettigrew.

(3.) Evanston Public Library.

(4.) Boulder Public Library.

(5.) Los Angeles Public Library.

(6.) See:

(7.) These tools, entitled Putting outcome evaluation in context: A toolkit, can be found on the Internet. See:

(8.) See for additional case studies, methodological approaches, and related articles.

(9.) Putting outcome evaluation in context: A toolkit provides an introduction to outcome evaluation as well as a multistep approach to identifying outcomes in a particular setting using contextual approaches.


Bates, M. J. (1999). The invisible substrate in information science. Journal of the American Society for Information Science, 50(12), 1043-1050.

Benner, P. (1984). From novice to expert: Excellence and power in clinical nursing practice. Menlo Park: Addison-Wesley Pub. Co.

Benton Foundation & Libraries for the Future. (1997). Local places, global connections: Libraries in the digital age. Washington, D.C.: Communications Development.

Bertot, J. C., McClure, C. R., & Ryan, J. (2001). Statistics and performance measures for public library networked services. Chicago: American Library Association.

Bishop, A. P., Mehra, B., Bazzell, I., & Smith, C. (2000). Socially grounded user studies in digital library development. First Monday 5(6). Retrieved January 15, 2003, from http://

Bishop, A. P., Van House, N. A., & Buttenfield, B. (Eds). (In Press). Digital library use: Social practice in design and evaluation. Cambridge: MIT Press.

Chatman, E. A. (1996, March). The impoverished life-world of outsiders. Journal of the American Society for Information Science, 47(3), 193-206.

Chatman, E. A. (1999, March). A theory of life in the round. Journal of the American Society for Information Science, 50(3), 207-217.

Chelton, M. K., Brody, R., & Crosslin, D. (2001). The adoption searchers' use of libraries: A pilot descriptive study. Reference & User Services Quarterly, 40(3), 264-273.

Childers, T., & Post, J. A. (1975). The information-poor in America. Metuchen, NJ: Scarecrow Press.

Creswell, J. W. (1998). Qualitative inquiry and research design: Choosing among five traditions. Thousand Oaks, CA: Sage Publications.

Denzin, N., & Lincoln, Y. (2000). Handbook of qualitative research. Thousand Oaks, CA: Sage Publications.

Dervin, B. (1992). From the mind's eye of the user: The sense-making qualitative-quantitative methodology. In J. D. Glazier & R. R. Powell (Eds.), Qualitative research in information management (pp. 61-84). Englewood: Libraries Unlimited.

Dervin, B., & Clark, K. (1987). ASQ: Asking significant questions: Accountability and needs assessment tools for public libraries: Technical report. Belmont: Peninsula Library System.

Dervin, B., & Dewdney, P. (1986). Neutral questioning: A new approach to the reference interview. RQ 25(4), 506-513.

Dervin, B., Zweizig, D., Banister, M., Gabriel, M., Hall, E. P., & Kwan, C. (1976). The development of strategies for dealing with the information needs of urban residents: Phase I: The citizen study. Final report of Project L0035J to the U.S. Office of Education. Seattle: University of Washington, School of Communications. ERIC: ED 125640.

Dewdney, P. (1986). The effects of training reference librarians in reference skills: A field experiment. Unpublished doctoral dissertation, the University of Western Ontario, London.

Dewdney, P., & Michell, G. (1996). Oranges and peaches: Understanding communication accidents in the reference interview. RQ 35(4), 520-536.

Dewdney, P., & Ross, C. (1994). Flying a light aircraft: Reference service evaluation from a user's viewpoint. RQ 34(2), 217-230.

Durrance, J. C. (1984). Armed for action: Library response to citizen information needs. New York: Neal-Schuman.

Durrance, J. C. (1989). Reference success: Does the 55% rule tell the whole story? Library Journal, 114(7), 31-36.

Durrance, J. C. (1991a, March-April). Public libraries and career changers: Insights from Kellogg funded sources. Public Libraries, 30(2), 93-100.

Durrance, J. C. (1991b). Kellogg funded education and career information centers in public libraries. Journal of Career Development, 18, 11-17.

Durrance, J. C., & Fisher-Pettigrew, K. E. (2002). Factors influencing changes in approaches to public sector evaluation. Reference and User Services Quarterly, 42(1), 43-53.

Durrance, J. C. (1994). Meeting community needs through job and career centers. New York: Neal-Schuman.

Durrance, J. C. (1995). Factors that influence reference success: What makes questioners willing to return? Reference Librarian, 49/50, 243-265.

Durrance, J. C., & Pettigrew, K. E. (2000). Community information: The technological touch. Library Journal, 125(2), 44-46.

Durrance, J. C., & Pettigrew, K. E. (2001). Toward context-centered methods for evaluating public library networked community information initiatives. First Monday. Retrieved January 15, 2003, from

Durrance, J. C., & Pettigrew, K. E. (2002). Online community information: Creating a nexus at your library. Chicago: American Library Association.

Durrance, J. C., Savage, K. M., Ryan, M. J., & Mallinger, S. M. (1993). Serving job seekers and career changers: A planning manual for public libraries. Chicago: American Library Association.

Durrance, J. C., & Schneider, K. G. (1996). Public library community information activities: Precursors of community networking partnerships. Ann Arbor: School of Information, University of Michigan. Retrieved March 5, 1997, from taospaper.html.

Dyson, L. S. (1992). Improving reference services: A Maryland training program brings positive results. Public Libraries, 31(5), 284-289.

Harris, R., & Dewdney, P. (1994). Barriers to information: How formal help systems fail battered women. Westport: Greenwood Press.

Hennen, T. (2002). Great American public libraries: The 2002 HAPLR rankings: The eagerly awaited--if overdue--measure of the nation's public libraries. American Libraries, 33(9), 64-68.

Hernon, P., & Altman, E. (1996). Service quality in academic libraries. Norwood: Ablex.

Hernon, P., & Altman, E. (1998). Assessing service quality: Satisfying the expectations of library customers. Chicago: American Library Association.

Hernon, P., & Dugan, R. E. (2002). An action plan for outcomes assessment in your library. Chicago: American Library Association.

Hernon, P., & McClure, C. R. (1986). Unobtrusive reference testing: The 55 percent rule. Library Journal, 111(7), 37-41.

Hernon, P., & Nitecki, D. (2001). Service quality: A concept not fully explored. Library Trends, 49(4), 687-708.

Hert, C. (2001). User-centered evaluation and its connection to design. In C. R. McClure & J. C. Bertot (Eds.), Evaluating networked information services: Techniques, policy, and issues (pp. 155-173). Medford: Information Today.

Himmel, E., & Wilson, B. (1998). Planning for results. Chicago: American Library Association.

Institute of Museum and Library Services. (2000). Perspectives on outcome based evaluation for libraries and museums. Washington, D.C.: IMLS.

Institute of Museum and Library Services. (2001). New directives, new directions: Documenting outcomes in IMLS grants to libraries and museums. Washington, D.C.: IMLS. Retrieved March 18, 2003, from

Kling, R. (1999). What is social informatics and why does it matter? D-Lib Magazine, 5(1).

Kling, R. (2000). Learning about information technology and social change: The contributions of social informatics. Information Society, 16(3), 1-37.

Kling, R., & Elliot, M. (1997). Digital library design for organizational usability. Journal of the American Society for Information Science, 48(11), 1023-1035.

Kranich, N. (Ed.). (2001). Libraries and democracy: The cornerstone of liberty. Chicago: American Library Association.

Kuhlthau, C. (1991). Inside the search process: Information seeking from the user's perspective. Journal of the American Society for Information Science, 42(5), 361-371.

Kuhlthau, C. (1993). Seeking meaning: A process approach to library and information services. Norwood, NJ: Ablex.

Kuhlthau, C. (1994). Teaching the library research process. Metuchen, NJ: Scarecrow Press.

Kuhlthau, C. (1999). The role of experience in the information search process of an early career information worker: Perceptions of uncertainty, complexity, construction, and sources. Journal of the American Society for Information Science, 50(5), 399-412.

Kuhlthau, C. (2001). Information search process of lawyers: A call for just for me information services. Journal of Documentation, 57(1), 25-43.

Kyrillidou, M. (2002). From input and output measures to quality and outcome measures, or, from the user in the life of the library to the library in the life of the user. Journal of Academic Librarianship, 28(1-2), 42-46.

Lance, K. C., Steffen, N. O., Logan, R., Rodney, M. J., & Kaller, S. (2002). Counting on results: New tools for outcome-based evaluation of public libraries. Final Report. Washington, D.C.: IMLS.

Lincoln, Y. (2002). Insights into library services and users from qualitative research. Library and Information Science Research, 24(1), 3-16.

Lynch, M. J. (1978). Reference interviews in public libraries. Library Quarterly, 48(2), 119-142.

Mark, J., Cornebise, J., & Wahl, E. (1997). Community technology centers: Impact on individual participants and their communities. Retrieved January 3, 2002, from http://

McClure, C., & Bertot, J. C. (2001). Evaluating networked information services: Techniques, policy and issues. Medford, NJ: American Society for Information Science & Technology.

McClure, C., Owen, A., Zweizig, D. L., Lynch, M. J., & Van House, N. A. (1987). Planning and role setting for public libraries: A manual of options and procedures. Chicago: American Library Association.

Mehra, B., Bishop, A. P., & Bazzell, I. (2000). The role of use scenarios in developing a community health information system. Bulletin of the American Society for Information Science, 26(4). Retrieved January 15, 2002, from

Michell, G., & Dewdney, P. (1998). Mental models theory: Applications for library and information science. Journal of Education for Library and Information Science, 39(4), 275-281.

Michell, G., & Dewdney, P. (2002). Lost in reference: Mental models of the public library. Unpublished manuscript.

Molz, R. K., & Dain, P. (1999). Civic space/cyberspace: The American public library in the information age. Cambridge: MIT Press.

Multnomah County Auditor's Office. (2000). Annual report. Portland, OR: Multnomah County Auditor's Office. Retrieved February 18, 2001, from aud/.

Nelson, S. (2001). The new planning for results: A streamlined approach. Chicago: American Library Association.

Nelson, S., Altman, E., & Mayo, D. (2000). Managing for results: Effective resource allocation for public libraries. Chicago: American Library Association.

Norman, D. A. (1983). Some observations on mental models. In D. Gentner & A. L. Stevens (Eds.), Mental Models. Hillsdale: Lawrence Erlbaum Associates.

Osborne, D., & Gaebler, T. (1992). Reinventing government: How the entrepreneurial spirit is transforming the public sector. Reading: Addison-Wesley Pub. Co.

Palmour, V., Bellassai, M., & DeWath, N. (1980). A planning process for public libraries. Chicago: American Library Association.

Patton, M. Q. (1997). Utilization-focused evaluation. Thousand Oaks: Sage Publications.

Pettigrew, K. E. (1999). Waiting for chiropody: Contextual results from an ethnographic study of the information behavior among attendees at community clinics. Information Processing & Management, 35(6), 801-817.

Pettigrew, K. E. (2000). Lay information provision in community settings: How community health nurses disseminate human services information to the elderly. Library Quarterly, 70, 47-85.

Pettigrew, K. E., Durrance, J. C., & Unruh, K. T. (2002). Facilitating community information-seeking using the Internet: Findings from three public library-community network systems. Journal of the American Society for Information Science & Technology, 53(11), 894-903.

Pettigrew, K. E., Durrance, J. C., & Vakkari, P. (1999). Approaches to studying public library-networked community information initiatives: A review of the literature and overview of a current study. Library and Information Science Research, 21(3), 327-360.

Pettigrew, K. E., Fidel, R., & Bruce, H. (2001). Conceptual frameworks in information behavior. In M. E. Williams (Ed.), Annual Review of Information Science and Technology 35 (pp. 43-78). Medford: American Society for Information Science & Technology, and Information Today.

Preer, J. (2001). Where are libraries in Bowling Alone? American Libraries, 32(8), 60-63.

Putnam, R. D. (1995). Bowling alone: America's declining social capital. Journal of Democracy, 6(1), 65-78.

Putnam, R. D. (2000). Bowling alone: Collapse and revival of American community. New York: Simon & Schuster.

Radford, M. L. (1999). The reference encounter: Interpersonal communication in the academic library. Chicago: Association of College and Research Libraries.

Ross, C. S., & Dewdney, P. (1998). Communicating professionally: A how-to-do-it manual for library applications. 2nd ed. New York: Neal-Schuman.

Ross, C. S., Nilsen, K., & Dewdney, P. (2002). Conducting the reference interview: A how-to-do-it manual for librarians. New York: Neal-Schuman.

Rossi, P., Freeman, H. E., & Lipsey, M. W. (1999). Evaluation: A systematic approach. Thousand Oaks: Sage Publications.

Saracevic, T. (2000). Digital library evaluation: Toward an evolution of concepts. Library Trends, 49(2), 350-369.

Savolainen, R. (1993). The sense-making theory: Reviewing the interests of a user-centered approach to information seeking and use. Information Processing & Management, 29(1), 13-28.

Savolainen, R. (1995). Everyday life information seeking: Approaching information seeking in the context of "way of life." Library and Information Science Research, 17(3), 259-294.

Sheppard, B. (2000). Outcome based evaluation: Showing the difference we make: Outcome evaluation in libraries and museums, Retrieved July 2, 2002, from current/crnt_obe.htm.

Steffen, N., Lance, K. C., & Logan, R. (2002). Time to tell the whole story: Outcome-based evaluation and the Counting on Results Project. Public Libraries, 41, 222-228.

Steffen, N., & Lance, K. C. (2002). Who's doing what: Outcome-based evaluation and demographics in the Counting on Results Project. Public Libraries, 41, 271-279.

Taylor, R. (1968). Question negotiation and information seeking in libraries. College and Research Libraries, 29(3), 178-194.

Unruh, K. T., Pettigrew, K. E., & Durrance, J. C. (2002). Towards effective evaluation of digital community information systems. Annual Conference of the Association for Information Science & Technology, Philadelphia.

Vakkari, P., Savolainen, R., & Dervin, B. (Eds.). (1997). Information seeking in context. London: Taylor Graham.

Van House, N., Lynch, M. J., McClure, C. R., Zweizig, D., & Rodger, E. J. (1987). Output measures for public libraries: A manual of standards and procedures. 2nd ed. Chicago: American Library Association.

Van Slyck, A. (1995). Free to all: Carnegie libraries and American culture, 1890-1920. Chicago: University of Chicago Press.

Warner, E. S., Murray, A. D., & Palmour, V. E. (1973). Information needs of urban residents. Baltimore: Regional Planning Council.

Wilson, T. D. (1997). Information behaviour: An interdisciplinary perspective. Information Processing & Management, 33(4), 551-572.

Wilson, T. D. (1999). Models in information behaviour research. Journal of Documentation, 55(3), 249-270.

Joan C. Durrance, School of Information, University of Michigan, 3084 West Hall Connector, 550 E. University, Ann Arbor, MI 48109-1092; Karen E. Fisher, Assistant Professor, The Information School, University of Washington, Box 352840, Seattle, WA 98195-2840

JOAN C. DURRANCE is a Professor at the University of Michigan School of Information (SI). She conducts research and teaches in the areas of information use, professional practice of librarians, and information services. Prof. Durrance's recent research includes How Libraries and Librarians Help and a new grant, Approaches for Understanding Community Information Use: A Framework for Identifying and Applying Knowledge of Information Behavior in Public Libraries. Both of these studies have been funded by IMLS and conducted with co-principal investigator Karen E. Fisher of the University of Washington. Other recent research includes an earlier project funded by IMLS and additional studies funded by the W. K. Kellogg Foundation. Her research on the reference interview was recognized with an R. R. Bowker-Isadore Mudge Award. Prof. Durrance's research has resulted in a number of articles and books. Her most recent book, coauthored with Karen E. Fisher, is entitled Online Community Information: Creating a Nexus at Your Library (ALA Editions, 2002).

KAREN E. FISHER is an Assistant Professor at The Information School, University of Washington where she teaches information behavior, community analysis, and qualitative research methods. She holds a B.A. (English and Russian, 1989) from Memorial University of Newfoundland, and M.L.I.S. (1991) and Ph.D. in library and information science (1998) degrees from the University of Western Ontario. In 1998-99 she held a postdoctoral fellowship, funded by the Social Sciences and Humanities Research Council of Canada, at the University of Michigan School of Information.
COPYRIGHT 2003 University of Illinois at Urbana-Champaign

Article Details
Author: Fisher, Karen E.
Publication: Library Trends
Geographic Code: 1U2NY
Date: Mar 22, 2003
Previous Article: Improving health care through information: research challenges for health sciences librarians.
Next Article: Public library service to children and teens: a research agenda.
