Why facts don’t change minds

The secret to successful risk communications

When it comes to public opinion, there's a common belief among companies and their communication teams that providing facts is the best way to sell their side of the story. The more information, the more likely the public will come onside, right? Wrong.

Look at high-profile examples across B.C. such as Kinder Morgan’s Trans Mountain pipeline expansion project or Enbridge’s Northern Gateway pipeline project. Despite an overwhelming amount of information provided by these companies and their armies of PR people, these proposed developments have become textbook examples of how not to try to achieve social licence to operate.

Despite these companies’ assurances their projects will create jobs, are safe and respect the environment, the public continues to see the benefits as small and the risks unacceptably high. In fact, the B.C. government recently said it couldn’t support Kinder Morgan’s pipeline expansion project because the company isn’t offering enough details around how it would respond to a potential spill.

It’s not that information doesn’t matter. Society is structured around the use of information to back arguments and make decisions. However, a growing body of research on how people develop perceptions of risk shows that information alone does not change people’s fears and concerns about what is risky. Emotions play a huge part.

According to University of Oregon psychologist Paul Slovic, who has studied the social and cultural factors that lead to disputes about risk, the problem lies in the difference between how "experts" and the public view risk.

Experts look at risk as a calculation of probability and consequence. It's about numbers. The public takes a more personal approach; their perceptions are shaped by personal control, voluntariness, children and future generations, trust, equity, benefits and consequences.

Slovic says the mistake the experts (and the companies they represent) make is viewing themselves as objective and the public as subjective. They perceive the public as being too emotional and having irrational fears. The public is then dismissed as laypeople who get the facts wrong and don't understand the evidence.

“Laypeople sometimes lack certain information about hazards,” Slovic says. “However their basic conceptualization of risk is much richer than that of experts and reflects legitimate concerns that are typically omitted from expert risk assessments.”

This is where a company’s decision to “educate” the public to take its point of view can really backfire. People aren’t sitting around waiting to be told what to think. In fact, few of us like being told what to think.

Risk communicators need to be sensitive to this broader concept of risk. Facts aren’t just facts. They aren’t as objective as we assume they are. Facts and risk are subjective for both experts and the public. They are a blend of values, biases and ideology. The hypodermic needle theory of communication, where we administer the facts to cure people of their misunderstanding, doesn’t work.

James Hoggan is a public relations consultant. His latest book, I'm Right and You're an Idiot: The Toxic State of Public Discourse and How to Clean it Up, will be published in May.