X-Message-Number: 30716
From: David Stodolsky <>
Subject: Nanotechnology risk perceptions
Date: Sat, 26 Apr 2008 11:14:42 +0200

http://www.abc.net.au/rn/scienceshow/stories/2008/2176777.htm#

And so back to that AAAS press conference and those public attitudes  
and how they're influenced.

Dan Kahan: I'm Dan Kahan, I'm a professor at Yale Law School, I'm  
affiliated with the Cultural Cognition Project which is a research  
team of professors from different universities. We receive our funding  
to look at nanotechnology risk perceptions from the National Science  
Foundation, and also the Project on Emerging Nanotechnologies at the  
Woodrow Wilson International Center for Scholars. We've been doing  
research for a little over a year now and are continuing to do so, but  
I'll just tell you about three of the findings that we have made so far.

The first is that although most people don't know very much about  
nanotechnology, they're still pretty opinionated about it. One of our  
surveys of a large diverse national sample showed that about 80% of  
the population in the United States either knows nothing or very  
little about nanotechnology. But 90% of those people who said they
didn't know very much, if anything, about it still had an opinion on
whether the risks would outweigh the benefits. What we found is that
this very quick, visceral reaction was driven largely by emotions. So
just the term 'nanotechnology' or even a very brief description of it  
can give somebody an initial sense of whether it's risky or beneficial.

The second finding is that as people start to learn about
nanotechnology, they don't form a uniform opinion. In fact they become
culturally polarised. There's a body of research (cultural cognition
is the mechanism that describes the phenomenon) showing that people
tend to conform their beliefs about risks to their values. So if
you're somebody who likes commerce and industry and private
initiative, you tend to be very sceptical about environmental risks.
If you're somebody who believes that commerce and industry do bad
things and create inequality, you'll embrace findings of risk. We
found that when people who hold values like these are exposed to even
just a little bit of information about nanotechnology, they divide
along those lines. So people who don't know anything initially will
form beliefs that fit their cultural predispositions on environmental
risk generally.

Then the third and final finding, which may serve as kind of a gloss
on something Elizabeth said, is that people do defer to experts on
nanotechnology, but we find that the experts they defer to are the
ones they perceive to have the same cultural values that they do. We
did an experiment where we created fictional experts, and we found
that people, just by looking at them and by reading a mock CV, would
impute to them values about how society should be organised. Then we
randomly assigned positions on nanotechnology to those advocates
(suspend it pending more research on risks, or allow it to continue
pending more research on risks), and we then saw how people reacted
to the arguments of these fictional experts. It turned out that
people would adopt whatever argument on nanotechnology was being
advanced by the experts whose values were closest to theirs. So if you
had an expert whose values you shared, who was presenting the argument
that fit your cultural predisposition, you became even more extreme,
whereas if that same expert, the person you identified with, said
'don't worry about it' and you were inclined to worry about it, you
would then think there's no problem whatsoever.

We were even able to create conditions where people couldn't find any
association among the experts, and then cultural polarisation as I'm
describing it disappeared. So the take-home message is that you
shouldn't just assume that people are going to form beliefs about
nanotechnology that match the best scientific understandings out
there. In the normal course they're going to form beliefs that fit
their cultural predispositions, but don't assume that's
inevitable. In fact it is possible to devise communication techniques
that can help to counteract that bias. That would be a good thing  
since everybody, regardless of their values, presumably wants to make  
a decision on the basis of the best available information we have  
about the risks and benefits of nanotechnology.





David Stodolsky    Skype: davidstodolsky