Friday, March 03, 2006

Sampling and response data for MSDN blogs-and-public-relations study

First off, a big thank-you to anyone and everyone who took the time to respond to the survey on blogs and relational outcomes. This study is a follow-up to the one just published in JCMC.

Here's how Drew (a grad student I work with here at UNC) and I got the sample of 500 names for the current study (a hypothetical sketch of the process follows below):

1) A few months ago, we started working through the reverse-chronological list of entries on blogs.msdn.com.
2) For each entry, we looked for comments.
3) Next, we checked whether each commenter made contact information available.
4) We kept going until we had 500 leads -- 179 e-mail addresses and 321 Web contact forms.
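For the technically curious: we did most of this by hand, but an automated version of the process might look something like the sketch below. Everything in it is an assumption for illustration -- the list of entry URLs, the "comment" CSS class, and the "contact" heuristic for spotting Web forms bear no claimed resemblance to the actual markup of blogs.msdn.com.

```python
import re

import requests
from bs4 import BeautifulSoup

# Loose pattern for spotting e-mail addresses in comment text.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def collect_leads(entry_urls, target=500):
    """Scan the comments on a series of blog entries for contact info,
    stopping once we have `target` leads; each lead is tagged as an
    e-mail address or a Web contact form."""
    leads = []
    for url in entry_urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        # Assumption: each comment sits in an element with class "comment".
        for comment in soup.select(".comment"):
            match = EMAIL_RE.search(comment.get_text(" "))
            if match:
                leads.append(("email", match.group()))
            else:
                # Otherwise take the first link that looks like a contact form.
                for a in comment.find_all("a", href=True):
                    if "contact" in a["href"].lower():
                        leads.append(("web_form", a["href"]))
                        break
            if len(leads) >= target:
                return leads
    return leads
```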

The idea behind choosing the sample in this way was to survey a group that had at least a minimum level of interaction with an organization (in this case Microsoft, chosen solely for the magnitude of its online presence) and some minimum level of exposure to its organizational blogs. I wanted to get enough responses to run some statistical models.

Since there are so many ways to calculate response rates (and since these methods don't always translate well from mail and phone surveys to online surveys), I'm just going to include a table here with the approximate figures I have after just a quick look at the data set:

Outcome                                                          #
---------------------------------------------------------------  ---
Undeliverable e-mails and inoperative Web forms (approximate)     48
Direct refusals via e-mail                                         4
Partially completed surveys                                       48*
Completed surveys                                                128
Nonrespondents/unknown (approximate)                             272
Total                                                            500

So even by conservative estimates, the survey had about a 26% completion rate (128/500). I'm guessing that for many of the analyses, I'll be working with about 140 cases out of about 450 that I assume received invitations. So depending on how you look at it, the "response rate" is somewhere between 26% and 31%. (I'll be in a better position to give details after I start "cleaning" and analyzing the data set.)
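Spelling out that arithmetic, in case you want to check my percentages (the 140 and 450 figures are my rough guesses, as noted above):

```python
# Back-of-the-envelope rates from the table above.
sampled = 500
undeliverable = 48    # bounced e-mails and dead Web forms
completed = 128
usable = 140          # my guess: completes plus the mostly-complete partials

completion_rate = completed / sampled    # 128 / 500 = 0.256 -> ~26%
delivered = sampled - undeliverable      # 452, i.e., "about 450" invitations
response_rate = usable / delivered       # 140 / 452 = 0.310 -> ~31%

print(f"Completion: {completion_rate:.0%}  Response: {response_rate:.0%}")
```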

Most respondents who did e-mail me were very gracious and supportive, even in cases where they questioned the nature of the research and the design of the study. On the whole, MSDN bloggers and those they interact with are an amiable group.** I've got to admit that surveying this group - a bunch of people with a lot more computer expertise than I have - made me a bit nervous.

It's one thing when someone sends out spam behind a cloak of anonymity. It's another when you attach your name to a real request for help. Only two people responded with irate, mean-spirited e-mail. I'd love to report their names and include their e-mails here, just to reveal their nasty side. (One bills himself as a consultant on e-mail etiquette, among other things - I hope that's not his day job.) The irony is that I've got a responsibility to protect their identities, while they freely attack me for disclosing mine.

*Many of these 48 responses were almost complete and provide usable data for the
analyses I'll run; others are people who just glanced at the first page, and
perhaps came back later to complete the survey.

**Lots expressed interest in hearing about the study's progress - hence this post - and one respondent did ask me to link directly to his blog: Michael Teper.