Pew Research recently reported that telephone response rates for survey research were about 9% in the run-up to the 2012 presidential election, a record low. Survey response rates are also declining in academic research, the topic of one of the more provocative presentations (of those I attended, at any rate…granted, a small and subjective subset) at the 2012 American Marketing Association's Summer Marketing Educators' Conference.
A low response rate isn’t necessarily an issue with respect to the integrity of the data (and Pew and others make that point), though flags often go up when investigators report response rates in the single digits. The larger issue (to my beady little mind, anyway) is the time and expense represented by paltry response rates. Having recently completed data collection for my own dissertation — a process involving a business-to-business survey — one is free to ask me how I know about the time and expense, um, associated therewith.
(Yes, it was available online as well as in paper-and-pencil form.)
It appears that people are less willing to sit still long enough to complete a survey. Perhaps they perceive themselves as more pressed for time; perhaps push polling and other questionable techniques have soured respondents’ stomachs for the work; perhaps the incentives offered, when offered at all, are insufficient or off base. Whatever the reason, declining response poses a nontrivial challenge for those of us who like to do quantitative research, especially on the B2B side. As I move into my career as a teacher-scholar, I will have to become more creative and innovative about obtaining data, in order to produce research with useful insights for practice and scholarship, as well as to address my own career goals and requirements. That’s a good thing in the long run, if a little bloodcurdling in the short.