AgBioForum
Volume 4 // Number 3 & 4 // Article 4
Sending Messages Nobody Wants To Hear: A Primer In Risk Communication
Jon Palfreman
Palfreman Film Group, Inc.
Drawing on case studies ranging from nuclear waste disposal to genetically modified organisms, this paper examines the art and science of risk communication. It argues that effective risk communication must not only take account of the psychology of how ordinary people perceive risks, but also allow for the actions of intermediaries—so-called Risk Amplifiers and Risk Minimizers.
Key words: risk communication; risk perception; Risk Amplifiers; Risk Minimizers; genetically modified organisms; GMOs; agricultural biotechnology; nuclear waste.
Imagine you are the president of a media company. Following proposals to build a high-level nuclear waste repository in the Nevada desert, the Department of Energy (DOE) commissions you to mount a campaign reassuring the local population that the new facility poses no danger. After conferring with a panel of engineers and scientists from the National Academy of Sciences, you become convinced that the facility will be extremely safe, with multiple physical barriers and dozens of warning sensors. The nearest people, it turns out, live 80 miles away across empty, parched desert that, ironically, is littered with plutonium left over from countless nuclear weapons tests in the 1950s. The fact that this nuclear waste, lying largely unmonitored for 40 years, has not caused any fuss leads you to think that perhaps this community might welcome a highly engineered, state-of-the-art nuclear waste facility.

Your campaign does not go as expected. In public meeting after public meeting, you find the audience is hostile. Your efforts at reassurance only seem to increase people's anger and outrage. You begin to sense that the reason this community does not want the repository has less to do with the objective health risks than with psychology, politics, and human values like fairness. Since there are no nuclear plants in Nevada, Nevadans think it unfair that they should be forced to take nuclear waste created in other states. The proposed facility—a graveyard for extremely dangerous radioactive waste—seems to offer no compensating benefits such as jobs and tax revenues. To the contrary, Las Vegans fear that the stigma of the repository may make Las Vegas less attractive to tourists. Worse, they keep talking about the fact that the waste will remain radioactive for tens of thousands of years, posing a danger to future generations.

Your employer, the DOE, tells you to reiterate the key message: the new repository will be safe, constituting the best "permanent" solution to this urgent national problem. You bring in more scientists—geologists, hydrologists, chemists—and ask them to state categorically that no radiation will escape from this repository and make it through the desert to Las Vegas, not even in 50,000 years. Now your experts become uncomfortable. An awful lot, they protest, can happen in 50,000 years: climate change, altered rainfall patterns, earthquakes. It turns out that many scientific details—such as the speed at which water travels through certain kinds of rock—are not known very precisely. The experts point out that predicting the state of the repository 50,000 years into the future is probably futile, especially given that few buildings anywhere in the world have survived for more than 1,000 years.

Once so confident of your facts, you now find yourself on the defensive as critics ask you a series of increasingly tricky (and arcane) "What if?" questions. What if there is an earthquake, a terrorist attack, a war? What if civilization suffers a great catastrophe and records are lost? Thousands of years in the future, people stumbling on the facility may not understand 21st-century written English. How, critics ask, will the facility be marked to warn strangers of the dangerous material buried within?

If nothing else, your first foray into the world of risk communication has taught you a simple truth: experts and lay people can end up figuring risks quite differently. Just why should this be so?

Perceptions Of Risk

Social scientists have long been interested in the ways humans perceive risks (Cosmides & Tooby, 1996; Kahneman, Slovic & Tversky, 1982; Slovic, 2000; Slovic, 1997). Their basic theory goes like this. Over tens of thousands of years, human brains evolved shortcut methods to quickly size up novel situations in order to avoid dangers and seek rewards. These "heuristics," while once useful for survival, can prove a liability when we try to parse the risks and benefits concealed within modern scientific controversies.

Herein lies the beginning of an answer to why your communication about nuclear waste was so ineffective. Because human brains did not evolve to handle probabilistic reasoning, we are not very good at it. In fact, say these social scientists, our minds do not rank risks by their statistical probability, but instead use the same risk calculus employed by our cave-dwelling ancestors.1 Here is a summary of what the research finds again and again.

Humans find highly improbable risks with a potential for cataclysmic disaster (e.g., nuclear accidents, child abduction, plane crashes) much scarier than everyday risks with a high probability (e.g., car accidents). We are also much more concerned about risks that are imposed on us (e.g., nuclear energy, chemical pesticides) than risks that we enter into voluntarily (e.g., driving, smoking, obesity, bungee jumping). And we feel better if benefits (jobs, community tax revenue, money, opportunity) accompany the risk than when no benefits accrue. The public tends to be much less scared if they trust the regulatory agencies involved than if they mistrust them. An especially powerful variable (one that can turn a controversy on its head) is need. If a particular commodity or service is needed, then people are much more "risk tolerant." United States approval ratings for nuclear power rose from 42% in October 1999 to 66% today (Nuclear Energy Institute, 2001), largely, experts believe, because of the energy crunch in some western states. Finally, new, transformative, hyped technologies are much more likely to cause public anxiety than established, familiar technologies that have been around for centuries—even if those older industries and practices are inherently more dangerous.

As a television producer who needs his audience to pay attention to the salient details of a complex policy debate—such as, for example, the pros and cons of genetically modified organisms—I have come to respect these psychological tendencies. Since these risk determinants are ingrained in all of us, honest risk communicators everywhere have no choice but to deal with them head-on.

Media Matters

But it gets worse. Risk communicators trying to impart sensible mainstream scientific knowledge typically must compete with other skilled messengers. Once a new "risk" has been identified, groups with a stake in the controversy may act to amplify or minimize it. Interested corporations typically try to play down the risk, while advocacy groups may use their considerable communication skills to attract the media, hoping to win a share of the public's attention for their pet issue. These "Risk Amplifiers" know that what attracts journalists and television producers are good "stories" with clearly defined characters, plenty of conflict and emotion, and strong moral issues capable of some resolution.

Despite what is said about corporate influence, at this stage the Risk Amplifiers have a distinct advantage over the "Risk Minimizers." There is a well-documented tendency, termed "reporting bias," whereby investigators report positive risk associations more readily than findings of no risk association. This is made worse by "publication bias," the preference of journal editors for positive risk association reports over null findings.2 Once the popular media have become interested in these reports, the perceived risk may be further enhanced by two other effects (Gilovich, 1991). Because journalists are under pressure to tell a story in as simple and dramatic a way as possible, they indulge in "sharpening" (using weasel words like "may" and "could" to exaggerate the strength of a finding) and "leveling" (playing down or ignoring the caveats in a report). And superimposed on everything is the "risk ratchet." Hank Jenkins-Smith and colleagues have demonstrated that "good news" and "bad news" do not affect people to the same degree (Jenkins-Smith & Silva, 1998). A report that a risk is "slightly greater than previously thought" does more to increase the perception of risk than a report that a risk is "slightly less than expected" does to reduce it.

But the Amplifiers must be careful not to overstate their case. While characterizing an alleged risk in dramatic and specific terms gets public attention, there is a price to pay if your story does not hold up. The early opponents of silicone breast implants, for example, held that the implants caused autoimmune diseases such as rheumatoid arthritis. When this hypothesis was tested in a series of major studies and found to be incorrect, some plaintiff attorneys and physicians (the main silicone Risk Amplifiers) tried to claim that the implants caused peripheral neurological complaints instead, but it was too late to make the change and keep the attention of the public (Angell, 1997; Palfreman, 1996).

Lessons For Agricultural Biotechnology

It is time to apply some of this knowledge. Imagine you are back running your media company. Following reports from Europe that consumers are demonstrating against GMOs, a biotech industry organization asks you to come up with a good risk communication strategy in the US. Realizing just how complex a task this is, you decide to study the field extensively.

There appear to be some good signs. First, there is little scientific controversy; virtually all agricultural scientists hold that GM technology is powerful, exciting, and at least as safe as traditional agricultural techniques. Second, people seem to have great confidence and trust in the relevant regulatory agencies—the US Department of Agriculture, the Food and Drug Administration, and the Environmental Protection Agency. Third, there are actually some environmental and ethical arguments in favor of these crops: from using fewer pesticides to potentially helping farmers in the developing world increase yields. Another good sign is that it is hard to imagine a GMO-based catastrophic incident like Chernobyl. The main downsides that Greenpeace and other environmental groups are talking about are gene flow and the emergence of resistant pests, arguments that can also be leveled at traditional agriculture.

But you see a few warning signs. Genetically modified organisms have been imposed on American consumers, so any risk will be perceived as involuntary. And, in a country that produces a massive food surplus, it is hard to see a compelling need for GMOs. Also, you see an apparent contradiction in the "narrative" being communicated by agricultural scientists and biotech companies (the Risk Minimizers). On the one hand, they are trumpeting the new, cutting-edge nature of the technology. But when pressed about risks, the same people stress that agricultural biotechnology is simply a continuation of traditional crossbreeding methods used for millennia.

So before going forward you decide this time to set up some focus groups and commission some quantitative opinion research. And you immediately learn something astonishing. When you ask people if they have ever eaten any GM food, most say no. A majority of Americans are very surprised to learn they have been eating GM food—notably foods with GM corn and soy ingredients—for five or six years. And when they find out, they get angry, asking such questions as "Why weren't we told?" and "Why isn't the food labeled?" (Palfreman, 2001).

In the focus group discussions, some individuals start to connect GMOs to recent news reports. A story in which GM pollen appears to have killed monarch caterpillars concerns people, as does a story in which a variety of GM corn approved only for animal consumption ended up in human food products such as taco shells. Opinion research also reveals that there is a potentially dangerous "yuk" factor lurking in the background. When told that scientists can engineer genes from different species into plants, most people find the concept disquieting. You fear that the image of a fish gene in a tomato will provide powerful ammunition to the Risk Amplifiers, and note that in Europe Greenpeace has cleverly dubbed GMOs "Frankenfood."

You conclude that things are perilous. Any adverse findings about GMOs are likely to be reported, published, and picked up by the popular media, and any perceived risks amplified and ratcheted up. So, what is the best risk communication strategy? You cannot do much about the fact that GMOs are not perceived as necessary. You also feel it might be dangerous for biotech companies to make too much of the claim that GMOs will feed the world, given that most people realize they are, like all private companies, trying to make profits. But one variable appears very promising: choice. The ideal strategy is staring you in the face: labeling!

Risk Amplifiers like Greenpeace had lobbied hard for GM food to have labels, hoping it would drive consumers away, and Risk Minimizers like Monsanto had lobbied hard against labels for the same reason. You suddenly suspect that both sides may have it completely wrong. You instruct the opinion researchers to go back and ask people how the introduction of mandatory food labels would affect their attitudes to GMOs. Would having mandatory labels, which enabled people to choose whether to eat GMOs or to avoid them, increase or decrease their concerns about GMOs? The results are striking. Far from stigmatizing the food, labels eliminate much of the concern American consumers have about GMOs. As Jenkins-Smith says, "The single thing that leads people to support GMOs is the labeling question. If labels were there and people could choose, you get a massive majority in favor of GMOs" (Jenkins-Smith et al., 2001; Palfreman, 2001). And the theory of risk perception predicts exactly such an effect, because making labels mandatory changes an involuntary risk into a voluntary one. By giving people the ability to choose whether they want to eat GMOs, much of the outrage is removed.

You hand in your report, recommending that your employer lobby the FDA intensively to introduce mandatory labeling. Unfortunately, your advice is too counter-intuitive for your client. The biotech organization fires you and retains another media company instead.

Welcome to the thankless world of risk communication!

Endnotes

1 To be fair, there is some dispute over the precise nature of the perceptual bias. Kahneman, Slovic and Tversky's (1982) view has been challenged by Cosmides and Tooby (1996), who argue that the "perceptual flaw" is largely an artifact of forcing subjects to work with single-event probabilities in a Bayesian format. When permitted to use a "frequentist" approach (reasoning with counts of events rather than probabilities), people, they argue, do much better at statistical inference puzzles.
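
To see the kind of puzzle at issue, consider a standard textbook example (the numbers are illustrative, not from this article): a disease affects 1 in 1,000 people, a test catches every true case, but it also produces a false positive 5% of the time. Bayes' theorem gives the probability that a person who tests positive actually has the disease:

\[ P(\text{disease} \mid \text{positive}) = \frac{0.001 \times 1.0}{0.001 \times 1.0 + 0.999 \times 0.05} \approx 0.02 \]

Posed in probability terms, people tend to guess far too high. Restated in frequencies (out of every 1,000 people, 1 has the disease and tests positive, while roughly 50 healthy people also test positive, so only about 1 positive in 51 is a true case), the answer of roughly 2% becomes much easier to see.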

2 Some scientists have even argued for setting up a "Journal of the Null Result" to compensate for the bias in favor of publishing positive findings. After all, at the conventional 5% significance threshold, random chance alone will indicate statistically significant correlations between unrelated variables in roughly one test in twenty.
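
The arithmetic can be checked directly. Below is a minimal simulation sketch (in Python with NumPy and SciPy; it is an illustration, not part of the original article) that generates many pairs of unrelated variables and counts how often a nominally "significant" correlation appears by chance:

    # Simulate many "studies" of two unrelated variables and count how often
    # a correlation reaches nominal significance (p < 0.05) purely by chance.
    import numpy as np
    from scipy.stats import pearsonr

    rng = np.random.default_rng(0)
    n_studies, n_subjects = 10_000, 50
    false_positives = 0

    for _ in range(n_studies):
        x = rng.normal(size=n_subjects)  # no real relationship between x and y
        y = rng.normal(size=n_subjects)
        _, p_value = pearsonr(x, y)
        if p_value < 0.05:               # "statistically significant" by convention
            false_positives += 1

    print(f"Spurious 'significant' correlations: {false_positives / n_studies:.1%}")

Run with any seed, the printed rate comes out close to 5%, which is precisely the stream of chance "positive" findings that a journal of null results would help put into perspective.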

References

Angell, M. (1997). Science on trial: The clash of medical evidence and law in the breast implant case. New York: W.W. Norton.

Cosmides, L., & Tooby, J. (1996). Are humans good intuitive statisticians after all? Rethinking some conclusions from the literature on judgment under uncertainty. Cognition, 58, 1-73.

Gilovich, T. (1991). How we know what isn't so: The fallibility of human reason in everyday life. New York: Free Press.

Jenkins-Smith, H., & Silva, C. (1998). The role of risk perception and technical information in scientific debates over nuclear waste storage. Reliability Engineering and System Safety, 59, 107-122.

Jenkins-Smith, H., Silva, C., Mitchell, N., Whitten, G., & Vedlitz, A. (2001). Fields of dreams or nightmares? Public reactions to GMOs. College Station, TX: Texas A&M University, George H.W. Bush School of Government and Public Affairs Institute for Science, Technology and Public Policy.

Kahneman, D., Slovic, P., & Tversky, A. (1982). Judgment under uncertainty: Heuristics and biases. Cambridge, UK: Cambridge University Press.

Nuclear Energy Institute. (2001, April). Perspectives on public opinion.

Palfreman, J. (2001). Harvest of fear [Television series episode]. Nova/Frontline. Boston: WGBH TV.

Palfreman, J. (1996). Breast implants on trial [Television series episode]. Frontline. Boston: WGBH TV.

Slovic, P. (2000). The perception of risk. London: Earthscan.

Slovic, P. (1997). Trust, emotion, sex, politics and science: Surveying the risk-assessment battlefield. In M. Bazerman, D. Messick, A. Tenbrunsel, & K. Wade-Benzoni (Eds.), Psychological perspectives to environment and ethics in management. San Francisco: Jossey-Bass.


Suggested citation: Palfreman, J. (2001). Sending messages nobody wants to hear: A Primer in risk communication. AgBioForum, 4(3&4), 173-178. Available on the World Wide Web: http://www.agbioforum.org.