#11, 10-18-2007, 02:47 PM
Siegmund
Senior Member
Join Date: Feb 2005
Posts: 1,850

Re: Why do people cite surveys?

Speaking as a practicing statistician, who has done some amount of survey design and analysis even though it's not the bulk of my work...

I blame this perception of surveys as unreliable on two things, both tied to sloppy word usage (though I am not blaming the entire problem on journalists):

1) People tend to use the same name for uncontrolled whoever-wants-to-answer-answers or whoever-is-convenient-to-ask-gets-asked lists of questions as for carefully administered surveys. The sampling method IS critical to the validity of the results, and all the serious pollsters and most of the casual ones are aware of this - but...

2) When surveys are in the news, the "highlights" - usually the most significant or surprising final results - are all that gets reported. The actual published survey report includes literally pages of fine print: what the intended target population was, how many people were selected and by what method, how nonresponse was handled, how "I don't know" answers were counted, and so on and on and on.
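That fine print matters numerically, not just legally. As a rough sketch (my own illustration, not from any particular survey, and assuming simple random sampling - which real surveys often complicate with weighting and stratification), the margin of error behind a reported percentage is easy to compute with the normal approximation:

```python
import math

def margin_of_error(p_hat, n, z=1.96):
    """Approximate 95% margin of error for a proportion,
    assuming a simple random sample of size n."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# A "75% agree" headline based on 400 respondents:
moe = margin_of_error(0.75, 400)
print(f"75% +/- {moe * 100:.1f} points")  # roughly +/- 4.2 points
```

Notice that quadrupling the sample size only halves the margin of error, which is one reason the "how many were selected" line in the fine print deserves attention.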

It's REALLY easy for people to accidentally (or deceptively) leap from "75% of a sample of people 18 to 45 in major east and west coast cities" to "75% of Americans." Or "75% of attendees at the American Medical Association conference" to "75% of doctors".
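A toy simulation (all numbers invented for illustration) shows why that leap fails: if the conveniently sampled subgroup differs from the rest of the population on the trait being measured, the subgroup percentage says almost nothing about the population percentage.

```python
import random

random.seed(0)

# Invented population of 10,000: the first 2,000 are an easy-to-reach
# subgroup who agree at a 75% rate; the other 8,000 agree at only 40%.
population = ([1] * 75 + [0] * 25) * 20 + ([1] * 40 + [0] * 60) * 80
subgroup = population[:2000]

# Survey 400 people - but only from the convenient subgroup.
sample = random.sample(subgroup, 400)
print(f"subgroup estimate: {sum(sample) / len(sample):.0%}")

true_rate = sum(population) / len(population)
print(f"population truth:  {true_rate:.0%}")  # 47%
```

The subgroup estimate lands near 75% and is perfectly accurate - for the subgroup. Reporting it as a statement about the whole population is where the error enters, and no amount of sample size fixes it.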

If you can see the report produced by the actual surveying agency, or even the press release it prepared, and you read it with a careful eye for word choice, it's usually quite easy to tell good surveys from bad. But that necessary information was never present in the bad surveys that make it into the news, and is stripped from the news reports of the good ones, leaving the public (those members of the public who care) with no way to tell.