
PR Blog Surveys Abound :: As of yet, only one makes the grade

The one survey that makes the grade is by a Northwestern University master's student, who earned his degree with the work. PR firms would do well to read it and use it as a guide for their future efforts. Read on…

PR News and Vocus recently teamed up for a little lead generation survey. I’m guessing it went out to thousands of people from the PR News email address database.

That’s fine. A transparent effort (with the Vocus name at the top of all choices – the rest in alphabetical order). But, with Vocus advertising on the PR News site, I’m guessing PR News shared its database, so all are happy. And it is – after all – a lead-generation effort to help Vocus take business away from its competitors.

But other recent surveys – like the one from Edelman and Technorati, and the Guidewire Group/iUpload surveys – are another story.

I waited for better information to be posted by Edelman/Technorati and Guidewire/iUpload, but haven’t seen any. So, here goes.

Constantin Basturea has written about them: Guidewire/iUpload, a response from Guidewire, Edelman and Technorati, and one of Constantin’s follow-ups.

Thus far, Edelman has only offered this poor methodology statement:

Methodology:
Technorati is the leading Blog Analyst Firm. It invited over 30,000 subscribers to its electronic newsletter to participate in the study. Bloggers were also encouraged to take the study through blog posts on Technorati’s website. The online study was open for one week; 18 questions; 3 open-ended. There were 821 respondents. There is a +/- 3% margin of error.

What they do not say is (a) whether or not they can determine which respondents came from the email notification and which came from random visitors, and (b) what percentage of each is represented in the reported results. Only with that information can we begin to assess the survey’s validity.

The “+/- 3% margin of error” has no legitimacy here without more information. And please note that a margin of error describes only how much results would vary if the same survey were repeated with respondents drawn the same way from the same pool. Since some or many respondents were non-identifiable random surfers, that pool cannot be reproduced, so we have no way of finding out whether a second or third attempt would yield similar results.
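For context, here is a rough sketch of where a figure like “+/- 3%” normally comes from – a minimal Python calculation assuming the textbook simple-random-sample formula at 95% confidence; Edelman does not say how its number was actually derived. The formula presupposes a probability sample drawn from a defined population, which a self-selected pool of newsletter readers and passing surfers is not:

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        # Textbook margin of error for a simple random sample:
        # z * sqrt(p * (1 - p) / n), with z = 1.96 for 95% confidence
        # and p = 0.5 as the worst case. Valid only for a probability
        # sample of a defined population, not a self-selected pool.
        return z * math.sqrt(p * (1 - p) / n)

    print(f"n = 821: +/- {margin_of_error(821):.1%}")  # about +/- 3.4%

Even taken at face value, that calculation says nothing about who the 821 respondents are or whom they represent.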

Edelman/Technorati did have a large respondent pool. If we can comb out the email respondents from the random surfers, we might at least be able to determine what some Technorati users think about the subject of the survey. Another problem is the broad generalizations made when applying the results to the blog world at large.

Still, Edelman/Technorati is way ahead of Guidewire/iUpload. Guidewire/iUpload’s methodology reporting was better in some respects, but a respondent pool of only 140 – all of them CMOs – makes the survey, at best, a glorified test run, good only for preparing a better survey down the road.

I won’t even bother to go into their survey and the outrageously bogus claims they made from a respondent pool of 140 out of an emailing to 5,000. They should be ashamed and apologize publicly for those claims. The blogs, sites and MSM outlets that furthered those claims without doing any research of their own – or even stating the response rate – should be equally ashamed.
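For the record, a quick back-of-the-envelope check (a Python sketch; the 140 and 5,000 figures are the ones reported, and the margin-of-error line charitably pretends the respondents were a true random sample, which they were not):

    import math

    invited, responded = 5000, 140

    # Response rate: 140 respondents out of 5,000 emails sent.
    print(f"response rate: {responded / invited:.1%}")  # 2.8%

    # Even if the 140 had been a true random sample of CMOs, the
    # worst-case margin of error at 95% confidence would be:
    moe = 1.96 * math.sqrt(0.25 / responded)
    print(f"margin of error at n = 140: +/- {moe:.1%}")  # about +/- 8.3%

That 2.8% response rate is exactly the number the outlets repeating the claims should have stated.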

You can’t attempt a survey, get it wrong, and then apply the results to the world at large. It is just crazy. Um, like this one: “The vast majority of companies (89%) are either blogging now or planning to blog soon.” Drop the Kool-Aid, people. Don’t force it on others, either.

Frankly, Guidewire/iUpload shouldn’t even have released those results. But, as you might guess, they were using them to hype BlogOn. That press release, once you know the truth about their survey, is embarrassing. Damn the legitimacy; full speed ahead on hype. Hardly the poster children for sound and transparent research or PR activities. Another press release that should never have gone out.

Students doing master’s-level research have done a much better job of crafting their surveys and reporting the results. Check out Dan Li’s efforts at Northwestern University and read his results. You may even read the Table of Contents of his report. Now that is a survey methodology report.

Cameron A. Marlow at MIT has also published and presented some work. We are still waiting to learn the results of his Blogdex survey, his doctoral dissertation project. Most of his sites are down now, though; the project may have been dropped.