AgBioForum
Volume 1 // Number 1 // Article 4
The Emotional Response To Risks: Inevitable But Not Unmanageable
Henry Miller
Stanford University
People are afraid of technology, no question about it. Some apprehension about new gadgets or products is just fear of the unknown that can easily be overcome with a little experience. But the "emotional dimension" of concerns about technology's potential risks and threats to public health or the environment is less readily addressed and can have a profound impact on consumers' acceptance of new technology. The new biotechnology—the use of precise, state-of-the-art molecular techniques for genetically improving varieties of microorganisms, plants, and animals—is a case in point.

As the government makes decisions about consumer products, fear and intimidation from several possible sources may distort the accurate assessment of risks, benefits, and possible alternatives. This can lead to decisions that are harmful from both an economic and a humanitarian perspective. Understanding the emotional dimension can help health and food professionals and scientists address the public's largely emotional responses, and it can help the public reach clearer-headed decisions, free from cynical manipulation.

The Emotional Dimension of Risk Perception and Response

Several factors that can cloud thinking about risks, and that have been prominent in various controversies about biotechnology, include the following:

Uncertainty and ambiguity. Studies of risk perception have shown that people tend to overestimate risks that are unfamiliar, hard to understand, invisible, involuntary, and/or potentially catastrophic—and vice versa. Thus, they tend to underestimate risks that are relatively clear and comprehensible in nature, such as using a chain saw or riding a motorcycle. But they overestimate invisible "threats" such as electromagnetic radiation or trace amounts of pesticides in foods, which inspire uncertainty and fear. Contributing to these emotions may be poor scientific literacy in general and unfamiliarity with the statistical aspects of risk in particular. For example, exactly what does it imply for an individual if we learn that eating a high-fat diet increases the probability of bowel cancer by 17%?
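To make that arithmetic concrete, the brief Python sketch below converts a relative figure such as "17% higher" into an absolute change in risk. The 4% baseline lifetime risk and the function name are hypothetical choices for illustration only; they are not drawn from the article.

```python
# Illustrative sketch: a relative-risk figure ("17% higher") says little by itself;
# the absolute change depends on the baseline risk, which is assumed here.

def absolute_change(baseline_risk: float, relative_increase: float) -> tuple[float, float]:
    """Return (new_risk, absolute_increase) for a baseline probability and a
    fractional relative increase (0.17 means '17% higher')."""
    new_risk = baseline_risk * (1.0 + relative_increase)
    return new_risk, new_risk - baseline_risk

baseline = 0.04  # hypothetical 4% baseline lifetime risk, for illustration only
new_risk, delta = absolute_change(baseline, 0.17)

print(f"Baseline risk:                {baseline * 100:.1f}%")                    # 4.0%
print(f"Risk after 17% relative rise: {new_risk * 100:.2f}%")                    # 4.68%
print(f"Absolute increase:            {delta * 100:.2f} percentage points")      # 0.68
```

On that illustrative baseline, a 17% relative increase amounts to less than one percentage point of added absolute risk, a distinction that nonexperts rarely hear stated plainly.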

In the case of the new biotechnology, several factors are at work. First, there is sparse knowledge of the long, safe history of conventional "biotechnology," or "genetic engineering," used to produce vaccines, enzymes, and antibiotics, as well as virtually all of our domesticated crops. In fact, unless your diet is limited to wild berries, game, fish, and shellfish, it is almost impossible to get through a day without eating food that has been genetically engineered.

Second, where genetic engineering moves genes between organisms, there seems to be a fear of disturbing evolutionary sanctity or the natural order of things. Moreover, many people do not understand the concept of alternative risks; for example, although there are theoretical risks of using biocontrol agents to eliminate plant pests, there are real and nontrivial risks of not using them—namely, the need to rely on chemical pesticides or to endure vast losses of crops.

Information overload. At best, nonexperts are likely to understand only a limited number of aspects of a risk analysis problem, and these people are easily overloaded with data. Information overload of the public is a strategy often used by those who would elicit fear about or disparage new technology. In one short peroration on biotech-derived foods, for example, an antitechnology activist might address: the consumer's "right to know" via product labeling, the "vegetarian issue" of fish genes introduced into tomatoes, the safety and socioeconomic issues of bovine growth hormone, and the alleged dangers of herbicide-resistant plants.

Antibiotechnology activists deluge the public with irrelevant, untrue, or (still more pernicious) partly true information that leaves the nonexpert bewildered, and this can lead to snap decisions and poor judgment.

Splitting and projection. A common response to fear and uncertainty is to split those involved in controversy into opposite camps—us vs. them—and to project onto them culpability and iniquitous intentions. Psychologically, this is an attempt to reduce anxiety and to reimpose certainty and clarity. These defense mechanisms may be activated especially easily when the "enemy" is painted as faceless, profit-hungry, amoral, multinational companies that will benefit handsomely from the sale of products. But such mechanisms are unproductive, because they polarize thinking and actually distort sound decision-making.

The wish for a return to a childlike world of purity and innocence. This romantic, puerile view of the physical world, reflecting a wish to escape from complex realities and choices, can give rise to a kind of puritanical, antitechnological view of the world. Purity and simplicity become desired ends in themselves, to the exclusion of other goals such as feeding and sheltering the inhabitants of the planet.

Manipulation of environmental anxieties. The hidden agenda of many of those who have attempted the "greening" of American society and government—environmental organizations, political leaders, and the media—is their own self-interest. But a by-product of their misinformation campaigns is progressively more widespread acceptance of junk science. Clouding the public's understanding of the development of new, biotechnology-derived varieties of crop plants, certain environmental organizations and the media have raised misinformation to an art form; the term "Frankenfood" was even coined in the pages of The New York Times. What has been lost is the ability to discriminate between plausibility and reality.

Conclusions

What are the take-home lessons, then, for health and food professionals and scientists who need to communicate the risks and benefits of new products?

First, while emotional responses to questions of technological risk may be inevitable, they can and should be tempered with knowledge.

Second, that knowledge needs to be imparted in a way that is scrupulously honest but also simple enough to be understood. Concrete examples, especially relevant historical analogies, are often useful.

Third, in both public forums and (especially) as advisors to government, experts should insist on the inextricable linkage between science and public policy. At every opportunity, they should reinforce the importance of science and the scientific method—for science is organized knowledge, and knowledge is power.

Fourth, there has been far too much tolerance of outright misrepresentation and mendacity in what are fundamentally scientific dialogues. Far too often, government policymakers have welcomed anti-technology activists to their advisory committees, hearings, conferences, and bosoms. Biologist Donald Kennedy, former FDA Commissioner and former Stanford University president, chides policymakers: "Frequently decision-makers give up the difficult task of finding out where the weight of scientific opinion lies, and instead attach equal value to each side in an effort to approximate fairness. In this way extraordinary opinions, even those like Mr. [Jeremy] Rifkin's, are promoted to a form of respectability that approaches equal status."

But Kennedy is too charitable. In the arena of biotech policymaking, the policymakers have used the high-profile demands of anti-science groups to make extreme regulatory nostrums seem justified.

Although freedom of expression and vigorous debate are conducive to science and science policymaking, we must distinguish science from pseudoscience. Organizers of NIH-sponsored conferences on genetics do not, after all, invite creationists; and applied-physics meetings do not include sessions on the newest designs for perpetual-motion machines. There are well-intentioned members of the academic, government, industrial, and nonprofit communities who would attempt rational public dialogue with antibiotech activists. I advise against it: the hidden agenda of some activists is to expand the domain of some people's political will over others', and the agenda of antibiotech activists is often not to convey accurate, useful information about technological risk, but to dictate what scientific research may be done, how it may be done, and which types of products may be produced and marketed.


Suggested citation: Miller, Henry. (1998). The emotional response to risks: Inevitable but not unmanageable. AgBioForum, 1(1), 14-16. Available on the World Wide Web: http://www.agbioforum.org.

© 1998 Henry Miller