Utilitarian Decisions in the Environmental Health Sciences.
The ways in which the federal government uses the environmental health sciences as assessment tools are examined in this paper from two perspectives:
1. the perspective of the government scientist/project director, whose objective is quality science with value, and
2. the perspective of the government manager, who must deal with budget and politics, who is responsible for addressing stakeholder concerns, and who must spend taxpayer money wisely.
The scientist is a health risk professional eager to establish and maintain an objective, accurate, provable hypothesis. The manager wants to provide answers and solutions that are understood and accepted by stakeholders (especially those certain that they have been affected).
Often in government work, one individual is both scientist and manager. This dual role is indeed a challenge, because science and politics may have different destinations. Science wants to prove that a correlation does or does not exist. Politics wants to solve the problem. These goals aren't the same. People who are convinced that they, or the environment, have been harmed, would like science to show proof for that harm. When science is unable to prove a correlation, it can only conclude that a correlation could exist. Unfortunately, "could" quickly advances to "does" in the mind of the concerned individual. This is understandable. Then the press (totally ignorant but declaring by inference total accuracy) turns the negative into an absolute and convinces the rest of us. The question then becomes "Can science be properly managed by the federal government in a public arena saturated with conflicting agendas and predetermined ideas about outcomes?"
Human health issues are emotional. Contamination implies environmental degradation, health threat, sickness, and death. Risk generates fear, and fear calls for the strongest force to eliminate the risk. Sciences such as dose reconstruction, risk assessment, health studies, the study of carcinogenicity, consequence assessment, epidemiology, and toxicity may be appropriate to the type of analysis required, but they can't always provide definitive conclusions. In other words, highly quantitative and accurate investigations may produce good science with no utilitarian value.
A utilitarian perspective means applying characteristics sorely needed in environmental and human health sciences work - namely, efficacy, functionality, and substantive value. Yet it is difficult to maintain a utilitarian perspective in the face of fear generated by thoughts of pollution, toxins, ecological risk, cancer, birth defects, and death.
Data and the Scientist
Public concern is often the fuel for science, and science has a way of stretching to meet those concerns; an important goal of science is its own advancement. When data are limited, good science is unable to reconstruct dose or assess risk. Because the public has a valid concern, however, the manager may ask the scientist to reach into the toolbox and grab a complex instrument - often without weighing the likelihood of solving the problem. Even when the scientist recognizes up front that a successful study will not change health risk or ecological threat, stakeholder concern alone may authorize the study. Often, a lengthy and expensive project will conclude that further study is needed only because a conclusive determination could not be made. That same conclusion often can be drawn before the study is begun, and the need should be addressed at that time.
New firms with catchy names are continually being spawned by the "new science" seed. The inherent difficulty of these sciences has been acknowledged by the development of another new science called uncertainty analysis. Uncertainty analysis is the science of measuring how ambiguous the results of the other sciences are. Thus, science, which sets out to be as exact as possible, finds itself pursuing the exactness of inexactness. Will the next new science be "certainology," the certainty of uncertainty? Or will we realize it's time to take account of what we should expect of science?
Adding to the problem is that scientists, managers, and politicians tend to work in isolation from each other. When they do pull together, the question is often "How can we satisfy the public outcry?" and the easy answer is "Grab the best tool we have and carry out a study." The question should be "How can we best understand the stakeholders' concern?" The answer should be "Let's work with them to clarify the problem and develop a meaningful plan of action that will answer their concern." Notice that the plan of action isn't to "address or solve" the concern. A concern can be addressed and studied for years with millions of dollars but never solved. Concerns can, however, be answered. The answer may be that a multimillion dollar study will likely provide a definitive cause and effect, and that the study, though expensive, has value. Or, the answer may be that the use of this tool will only bring us a little closer to understanding the complexity of the problem because the tool simply can't yield conclusive results either way. Thus, the benefit does not justify the cost of this particular study.
The Public and the Scientist
The scientist and the public are often seen as oil and water simply because they are wrongly stereotyped: The public's intelligence is overlooked, and the scientists are overlooked as stakeholders. The average person is intelligent, although he or she may not understand isotopic ratios and picocuries. Scientists are not remote entities with their heads stuck in a cloud. One reason for misunderstandings is that amongst the public, scientists experience the apple barrel effect. The scientist dips into the barrel, encounters imprudence, and considers the entire barrel a roadblock to understanding. In addition, the intellect in the barrel can be obscured by the loudness of the imprudence that tries to speak for the entire barrel. Many of the healthy apples hide at the bottom of the barrel to avoid getting bruised or abused.
The ensuing confusion overwhelms the scientist, frustrates the public, puts the manager in a defensive posture, and ends the rationality of the system. Consequently, the public assumes the scientist doesn't understand the public; the scientist assumes the public can't be reached because it doesn't align with or care about the science; and the manager becomes a "big-brother bad guy" accused of skirting the facts and hiding behind excuses. Add the hype-fueled reporter and a press with no concept of factual investigation, and you have chaos.
Budget and the Manager
In the confusion, the benefit-cost ratio may be obscured just when a government manager needs it most. The manager may be asked or forced to fund grants defined according to a scientist's or politician's interpretation of public need. Once funds are allocated, little accountability remains with the manager; with the grant goes the responsibility. When the grant project is underway, its direction is unlikely to change - even if the need for such a change becomes obvious. In the meantime, the public continues to ask for what it wants and fairly criticizes the study it didn't ask for.
One common approach is to form a committee or panel of experts and make them accountable. This approach can backfire. Although a committee may have a name, it often has no face. Chair and members come and go, and a member may see the "committee" as responsible, rather than the individual, thus deflating or destroying accountability altogether. The committee may become so wrapped up in science and politics that the objective of the study is lost; it's not unusual for science to default to politics. Also, once the committee gets rolling, it may be impossible to change the direction or scope of the study. Good science may produce no conclusive answers, and the committee may try to skirt criticism with a recommendation for additional study. One may look back on a completed study and wonder who was in charge and what the public purchased.
Trust, or lack thereof, plays a role. The public seems unable to look beyond past government errors and to trust government scientists or managers. It's a sad and costly state of affairs when the word government is synonymous with distrust. In an attempt to regain trust, the government can find itself spending money on science to solve a public perception problem that has no scientific resolution. Good intent and good science unfortunately can go awry and foster more distrust.
Because the public is not satisfied that its concerns are being adequately addressed even when more studies are funded, the scientists and managers view the public as unreasonable. The public condemns the government for wasting money and not answering its concern. In the interim, the situation escalates, amplitude overrules sensible science, and more projects that should never have been funded are approved. Such studies not only waste money and fail to solve the problem, but also divert funds from other needy areas, especially research. The scientist, the manager, and the public all get lost in a forest of confusion, false assumptions, politics, and unsupported decisions. The product may be good science, but it probably will sit on a shelf to be read only by other scientists.
Data and the Accountable Public Servant
As public servants, people in government are accountable to the public - but accountable for what? Too often, a complex health or environmental problem has no easy solution. This situation poses an ethical dilemma. It may be responsive on the part of the government official to attend to the demands of the stakeholder even though the official is certain no solution will be reached, but is it responsible? By contrast, if the public servant turns down a study because the study won't solve the problem, does the public consider the public servant to be accountable or unresponsive? What should be the role of the public servant?
Funding studies of questionable value can place the scientist and the manager at odds. It can also place scientists at odds with each other. Good scientists want to advance science and, when asked to assess a broad public concern, will offer the best they have. Because the scientist has been brought in to address a problem, not to understand the issues, the most sophisticated tool is "the best" simply by default. Other scientists may be left questioning the choice of tool but hesitant to suggest that little will be gained except advancement of the tool. Scientists will readily identify the limitations of a study if they understand what the real issue and goal are. This type of preemptive thinking, responsible as it is, may be irrelevant when government managers are in the throes of responding to public and political pressure.
When a study concludes that more study is needed, the concerned are still concerned, but also upset because time and money have been wasted. The situation arises because the concerned public, the agency with the funds, and the scientists never got together to evaluate what the problem is and what the tools can accomplish. Acting out of good intentions just is not enough anymore. The greatest exercise of responsibility is needed up front.
The Public, the Problem, and the Scientist
Scientists usually are not involved in the decision-making process, have no clue what the real concern is, and are not asked if a successful study will provide a solution. Would wisdom not suggest a peek at the likelihood of finding an answer before committing millions of dollars of public funds to collect good data that are useless? We should expect the scientist to understand the stakeholders' concerns before trying to solve them, and we should ask scientists to explain the limitations of the tools to the stakeholders. Individually, neither the scientist, the manager, nor the stakeholder has the answer, but together they probably do.
Decision Analysis, the Value Solution
When good scientists, good government, and good intent lead to a study that results in good science of no utilitarian value, a bad decision has been made. The "bad" isn't the result of the study; it is the decision to undertake the study.
Is there a solution to this dilemma? Yes, if one is willing to understand, accept, and combine a few basic principles of politics and science, divorce bureaucracy, and perhaps even incorporate another science to a greater or lesser degree. Making informed decisions up front can save years and millions of dollars from being wasted, and it can go a long way toward freeing the decision maker of criticism or blame. Most important, this kind of decision making assesses whether a solution can be found, how much of the problem can be addressed, and how best to do so.
Decision analysis is a science and a pivotal assessment tool, yet decision making is too often reduced to a concerned response - and sometimes to ego.
Whether formally or informally, decision analysis needs to come into play. It is the best way to put a problem into context with its genesis and solution. In analyzing a decision, dissecting its parts, and understanding what's needed to make a good decision, one is forced to think backwards and answer questions about motive, success, and value. The more the parts of a decision are dissected, the easier it becomes to quantify the value of those pieces. This process develops and clarifies the attributes of the decision and then asks the decision makers to weigh and compare the value of each attribute relative to the study. Decision making can draw on the intelligence, views, and opinion of all those involved (e.g., managers, scientists, those concerned about the issue, other stakeholders). It is blind to politics, outcry, pressure, and emotion - or at least it will identify those components so that they can be weighed properly. Looking into a kind of crystal ball at success or failure forces the decision maker to address and incorporate a full complement of facts. Decision analysis doesn't allow one to skip any issue out of fear or in the interests of convenience.
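The attribute-dissection step described above can be sketched as a simple additive value model: each attribute of a proposed study is scored, weighted by its relative importance, and summed. All attribute names, weights, and scores below are hypothetical illustrations, not values from any actual analysis.

```python
# Toy additive value model for a proposed study. Each attribute is scored
# on a 0-1 scale; weights reflect the decision makers' agreed priorities
# and sum to 1. All names and numbers here are invented for illustration.

weights = {
    "answers_stakeholder_concern": 0.4,
    "scientific_conclusiveness": 0.3,
    "cost_effectiveness": 0.2,
    "advances_methodology": 0.1,
}

def study_value(scores):
    """Weighted sum of attribute scores (0 = worthless, 1 = ideal)."""
    return sum(weights[a] * scores[a] for a in weights)

proposed_study = {
    "answers_stakeholder_concern": 0.2,  # unlikely to resolve the concern
    "scientific_conclusiveness": 0.3,    # data too limited to be definitive
    "cost_effectiveness": 0.5,
    "advances_methodology": 0.9,         # mainly advances the tool itself
}

print(round(study_value(proposed_study), 2))  # 0.36
```

Even this crude arithmetic makes the point of the passage above visible: a study that chiefly advances its own methodology can score poorly overall once the stakeholders' actual concern carries the dominant weight.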
This decision-making tool has been around for a long time. Ten years ago, the author was responsible for the allocation of a $12 million federal budget slated to address environmental concerns relative to a new commercial industry (coal gasification). When a number of studies were proposed, whose cost totaled more than $75 million, he had a dilemma. True, he had the authority to select a research portfolio, but his goals were
* to get the most value for the taxpayer dollar,
* to be able to justify his decision, and
* to win acceptance of the portfolio, especially by those who were not funded.
A formal decision analysis framework was used to quantify the value of each study and rank it for funding. The proposer of each study went through a rigorous step-by-step evaluation. Value comparisons were made among all proposed studies with the assumption that each study would reach its goal; the likelihood of success was assessed as well. Each proposer was involved in the decision process. In the end, the proposers understood why their studies - and all others - were or were not funded. The final selection was never questioned because it was clear to all involved. Most important, the return on the research dollar investment was maximized (1).
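The portfolio-selection process described above can be sketched as an expected-value ranking under a budget cap: each proposal's value (assuming it meets its goals) is discounted by its likelihood of success, and proposals are funded in rank order until the budget is exhausted. The study names, costs, values, and probabilities below are invented; the actual 1989 analysis is described in reference 1.

```python
# Hypothetical budget-constrained portfolio selection in the spirit of
# the process described in the text. All data are invented.

BUDGET = 12.0  # available funds, $ millions

# (name, cost in $M, value if goals are met, probability of success)
proposals = [
    ("groundwater transport model", 5.0, 80, 0.9),
    ("air emissions characterization", 4.0, 70, 0.8),
    ("long-term ecological survey", 6.0, 90, 0.5),
    ("novel sensor development", 3.0, 40, 0.6),
]

# Expected value per dollar is one defensible, traceable ranking criterion.
ranked = sorted(proposals, key=lambda p: (p[2] * p[3]) / p[1], reverse=True)

funded, remaining = [], BUDGET
for name, cost, value, p_success in ranked:
    if cost <= remaining:
        funded.append(name)
        remaining -= cost

print(funded)
# ['groundwater transport model', 'air emissions characterization',
#  'novel sensor development']
```

Note how the ranking itself explains the outcome to unsuccessful proposers: the ecological survey, despite the highest value if successful, is discounted by its low probability of success and high cost, which is exactly the kind of transparent, reiterable justification the text describes.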
Decision analysis is the opposite of uncertainty analysis because it seeks certainty in choice. Specificity and quantifying logic are maximized, and qualitative judgement and uncertainty are minimized. The process is nearly immune to the squeaky wheel and provides a logical, systematic, traceable, understandable, quantifiable, and defensible decision that is easily reiterated. It moves the decision makers (government agencies, industry, local organizations, and other stakeholders) from qualitative judging to value-focused thinking.
Decision analysis builds a framework for a thought process that charts the decision situation in detail. Objectives, attributes, and properties are developed and fine tuned. Then a model is built for quantifying values that are ordinarily thought to be unquantifiable. Hidden objectives also are uncovered. Most important, this process maps the decision equation so that it can be understood by all participants and explained to anyone. The process is described in detail in Ralph Keeney's Value-Focused Thinking (2).
The government finally is beginning to shed the old paradigm in which simply spending money to satisfy an outcry is considered responsive. Stakeholders are being recognized as positive forces and integral to the process of decision making. It's not enough, however, simply to involve stakeholders through a forum for addressing concerns. Stakeholder organizations are effective and meaningful when their contribution is quantifiable in terms of perceived value rather than in terms of the weight of an outcry. Formal decision analysis sorts this out, giving stakeholders an equal share in the decision process, as well as credit for the solution.
A decision always has a solid foundation when the government and the public join together and use the proper scientific tools to make value-focused decisions instead of exercising judgement under stress in response to outcry, amplitude, or politics. Does this mean the government should listen less or ignore the outcry? On the contrary, it means the government must listen more closely, understand in great depth what is being asked, and, when possible, bring the outcry into its team. The government must ignore politics and image, solicit the intellect of the stakeholders as well as of the scientists, and use only facts and science to arrive at decisions. In an unpatronizing way, issues, values, and science must be explained throughout the process - not only at the end.
One must accept, however, that when all is done, solving the problem may not always solve the controversy. Belief can be far stronger than reason or proof. Decision analysis won't eliminate the dissatisfaction of opposition members who are blind to decision-making tools or are motivated only by their own agendas and opinions, but it will answer them with reason, values, and facts that others can understand.
Finally, it's time we as scientists, managers, and public servants forget what self-proclaimed experts tell us about needing to explain science at the sixth-grade level so that the public will understand. Even individuals with advanced degrees don't always understand complex sciences. The information should be comprehensible to everyone. This is called intelligibility in composition, not sixth-grade writing. Anything less is irresponsible. Furthermore, we must continue to give members of the public the credit they deserve, treat them as partners in the process, and use their intelligence. The public must be allowed to see the process and, if possible, to participate in it. If they cannot do so in a rational and professional manner, they should be excused and told why.
The satchels of environmental health professionals are filled with wonderful scientific tools. We have a responsibility to understand those tools and use them wisely in consultation with the patient. We need to replace politics with science, emotion with facts, guesswork with decision analysis, and image with value; we need to continue moving toward trust. Perhaps we could even add a genuine smile.
Corresponding Author: Timothy W. Joseph, Ph.D., U.S. Department of Energy, Oak Ridge Operations Office, P.O. Box 2001, Oak Ridge, TN 37831.
Disclaimer: As a senior scientist employed by the federal government, the author is required to state that the views and opinions expressed herein are exclusively his and do not necessarily reflect the views and opinions of the U.S. government or any agency thereof.
1. Peerenboom, J.P., W.A. Buehring, and T.W. Joseph (1989), "Selecting a Portfolio of Environmental Programs for a Synthetic Fuels Facility," Operations Research, 37(5):689-699.
2. Keeney, R.L. (1992), Value-Focused Thinking: A Path to Creative Decisionmaking, Cambridge, Mass.: Harvard University Press.
Author: Joseph, Timothy W.
Publication: Journal of Environmental Health
Date: Oct 1, 1999