Addressing the “Balanced Coverage” Issue in the Media
The tactics and techniques for manufacturing doubt in the face of a scientific consensus were perfected by the major tobacco companies during the 1950s and 1960s, in their efforts to discredit cancer researchers’ mounting evidence of the link between smoking and lung cancer. In his 1995 book “Cancer Wars,” Robert Proctor documented the influence of professional, economic, and political interest groups on American governmental priorities and funding for cancer research. An infamous 1969 memo from one tobacco-industry executive declared that “Doubt is our product since it is the best means of competing with the ‘body of fact’ that exists in the mind of the general public. It is also the means of establishing a controversy.”
David Michaels’ 2005 article in “Scientific American” on the manufacture of uncertainty, and later his 2008 book, followed Proctor’s lead. He identified three primary messages orchestrated by the tobacco industry to challenge the scientific consensus linking smoking with lung cancer: (1) cause-effect relationships have not been established, (2) statistical analyses are inconclusive, and (3) more research is needed. The industry hired its own scientists, founded its own research publication (“Tobacco and Health Research”), and carefully orchestrated a media campaign to spread its messages. Naomi Oreskes and Erik M. Conway’s 2010 book took up similar themes, extending the account to how doubts were manufactured concerning climate change and global warming in particular, by organizations employing tactics inspired by the tobacco industry’s example. I won’t go into the details of doubt-inducing tactics here; the sources I’ve just mentioned do an excellent job on that topic. Instead, I want to raise two issues that supplement their coverage.
First, I should point out that uncertainty has its uses regardless of one’s political stripe. Indeed, doubts can serve both sides of a scientific controversy simultaneously, albeit for different purposes. Some fifteen years before Proctor’s book, I wrote an account (Smithson, 1980) of how both environmentalists and industrialists used initial uncertainties about the effects of CFCs on the ozone layer to bolster their agendas. Each side had seized on one of the two favorite responses to profound uncertainty. The environmentalists’ position was a precursor to the precautionary principle: Ban CFCs until it can be proven that they are not harmful. The industrialists’ argument reflected a well-known status-quo bias: Allow CFC production and marketing until they are proven harmful. Also, as we shall see, the mainstream media has uses for uncertainty, especially if it can be framed as controversy or conflict.
Second, Machiavellian scheming and normative scholarly skepticism are not the only producers of doubt. Doubt also can be an unintended byproduct of debate or balanced coverage of an issue. Journalists have been taken to task recently for giving “equal” time to global warming disbelievers, on grounds that the scientific consensus is so strong that lending credibility to disbelievers does the public a disservice.
The Australian media treatment of Ian Plimer’s 2009 book, “Heaven and Earth,” is a case in point. Plimer’s book was published just prior to the debate on the Emissions Trading Scheme (ETS) legislation in the Australian House of Representatives (June) and the Senate (August) in 2009. Despite the book being discredited by several of Australia’s top climate scientists, several newspapers published favorable editorials and opinion pieces about it, portraying it as a telling counter-argument against the scientific consensus on climate change.
Rather than simply being outraged by such occurrences, we may learn more by examining the motivations and payoffs behind these practices, which in turn may provide clues about how they might be reformed. Holly Stocking and Lisa Holstein’s 2009 paper presented a case study of media coverage of the controversy that followed the rapid growth of industrial hog production in North Carolina during the 1980s and 1990s. Stocking and Holstein are former science journalists who became academics. Their chief interest was in journalists’ responses to various attempts by the North Carolina Pork Council to discredit and discourage a University of North Carolina public health scientist’s research on health and environmental problems arising from hog production.
Stocking and Holstein began with the claim that “…claims-makers who offer contrary views, however outrageous, often are quoted in news stories because their inclusion reinforces the impression of journalistic objectivity, a hallowed ideal and a defining norm of journalists’ professional values” (p. 28). A byproduct of this even-handed exposure of views is increased (and perhaps unwarranted) public doubt about views that nonetheless are backed by considerable evidence and expert authority. One of their central claims was that the combatants often are aware of this norm and try to exploit it. A related point is that the scientists’ norm of openly admitting the limitations of and uncertainties in their research findings can become a disadvantage when less scrupulous opponents magnify those caveats in order to discredit the research or the scientists themselves.
Stocking and Holstein related four kinds of journalistic attitudinal clusters to the ways in which journalists treat conflicting views in scientific controversies.
The Disseminator and Adversarial roles are the most likely to raise doubts, but they do so in different ways. The Disseminator’s pursuit of even-handedness can lend weight to views that in other forums would be completely discredited. Stocking and Holstein’s examples of this approach included a reporter who “believed it was his obligation to publish the views of all parties to the hog research controversy, including the pork industry’s ‘pseudo-science’ label [of the UNC researcher’s studies] and its charges that the University of North Carolina had an ‘anti-farm bias’” (p. 32). The Adversarial journalist, on the other hand, is more likely to raise moral doubts (e.g., are the scientists truly impartial about the evidence? Do they have vested interests of their own?). Stocking and Holstein’s example here was an article that “framed UNC’s School of Public Health as a tax-supported institution that was taking an ‘activist stance’ with varied ‘anti-hog’ activities in research and educational programs alike” (p. 35).
Journalist Colin Schultz’s August 2010 blog post, “Tips for young science journalists: A crash course on the major issues in the field,” echoes the Stocking-Holstein claim that there is a widespread norm among journalists of giving every side of an issue an airing. He presents this as an example of a “frame” for a story. But his rationale isn’t objectivity or even impartiality; it’s attracting readers: “The most common frame by far in journalism is conflict. Here is a ‘good’ guy. Here is a ‘bad’ guy. The journalist might play up whatever opposition there is between them. Setting up some conflict gets the reader to associate with the people involved, bringing them into a debate to which they may otherwise pay no attention.” In short, controversy, and by implication doubt, sells stories.
There are at least two ways scientists might work more effectively with mainstream media. One is to be selective about which outlets and journalists they work with and/or endorse (e.g., avoiding those committed to the Disseminator or Adversarial models). Another is to alert and educate journalists about the downside of controversy-mongering. For instance, presenting conflicting views from two apparently equally authoritative sources may sell stories, but it also decreases credibility and trust in both sources (Smithson, 1999). Erosion of public trust is a major contemporary issue for scientists and governments, so there are grounds for scientists and policy makers to collaboratively militate against misguided media practices.
A third possibility, one that increasing numbers of scientists and scholars have invested in, is using or creating alternative media (mainly those spawned by the internet). Can the newer media do better? It may be too early to tell. Unregulated forums probably won’t, because they admit all comers and may thereby fall prey to the same indiscriminate “balance” problem. Regulated forums might, especially if their contributions come from domain experts; however, they may suffer from preaching to the converted unless their ambit is sufficiently inclusive. The greater interactivity of the new media, together with the emergence of appropriately regulated but fairly inclusive forums, seems to hold the greatest promise of enabling genuine controversies to be debated and false controversies to be put to rest.
An earlier version of this article was posted on BestThinking on the 27th of October 2010.
Michaels, D. (2005). Doubt is their product. Scientific American, 292 (6), 96-112.
Michaels, D. (2008). Doubt is their product: How industry’s assault on science threatens your health. New York: Oxford University Press.
Oreskes, N. and Conway, E. M. (2010). Merchants of doubt: How a handful of scientists obscured the truth from tobacco smoke to global warming. New York: Bloomsbury.
Plimer, I. (2009). Heaven and Earth: Global warming—The missing science. Lanham, MD: Taylor Trade Publishing.
Proctor, R.N. (1995). Cancer wars: How politics shapes what we know and don't know about cancer. New York: Basic Books.
Schultz, C. (2010). Tips for young science journalists: A crash course on the major issues in the field. http://colinschultz.wordpress.com/2010/08/03/tips-for-young-science-journalists-a-crash-course-on-the-major-issues-in-the-field/. Accessed 16 May 2011.
Smithson, M. (1980). Interests and the growth of uncertainty. Journal for the Theory of Social Behavior, 10: 157-168.
Smithson, M. (1999) Conflict aversion: preference for ambiguity vs. conflict in sources and evidence. Organizational Behavior and Human Decision Processes, 79: 179-198.
Stocking, H. and Holstein, L. (2009) Manufacturing doubt: journalists' roles and the construction of ignorance in a scientific controversy. Public Understanding of Science, 18: 23-42.