Knowledge is power, right? I usually assume that learning more about science or technology is helpful when it comes time to form an opinion or decide on a course of action, and I also assume that as people learn more about scientific or technical topics, they will tend to agree more about what should be done. But it doesn’t always work like that; in fact, in a recent study, the more people learned, the more they disagreed.
This article from Reason Online describes some research into people’s attitudes toward nanotechnology, published by the Yale Law School’s Cultural Cognition Project. After 1,850 people, most of whom knew little or nothing about nanotech, were given a small amount of information on the subject, most were willing to venture an opinion on its risk-benefit ratio. The authors of the study say the primary driver of those opinions was how people felt about the issue: they were going with their gut, basically. And when a subset of the group was given a couple more paragraphs of information, they believed even more strongly in their original positions (e.g., if they thought nanotechnology was risky before getting the additional info, they thought it was even riskier afterward).
This looks like a striking and discouraging example of confirmation bias, the tendency to notice new information that confirms our existing beliefs and discard new information that doesn’t. The Reason Online article also discusses some of the implications for science and technology policy. I’m curious about how we arrive at our gut reactions in the first place, and whether there’s any way to learn to make more informed initial judgments.
So if I don’t already agree with that article, I guess it’s not worth my time to read it!
Alternatively, you might be able to find something in it to support your position no matter what that position is!