As someone who appreciates insight into the thinking of higher education's many actors, I get excited whenever a news organization invests its resources in fielding a large survey of college faculty, administrators, students, or the employers who hire them.
Thus, I was initially excited to see the results from the latest survey conducted by the Chronicle of Higher Education and American Public Media's Marketplace on how employers view the skills and talents of recent college graduates. Reading through the results, I began to think the story contained provocative information, for example on how businesses view online education.
But then I got to the story's end and stopped cold. There lay the description of the survey methodology: "The findings on these pages come from a survey developed, fielded, and analyzed by Maguire Associates Inc., a higher-education consulting firm, on behalf of The Chronicle and American Public Media's Marketplace. Maguire invited 50,000 employers to participate in the study. Experience.com, a career-services consultancy, helped develop the sample by providing a contact list of employers that recruit recent college graduates. The survey was conducted in August and September 2012. There were 704 responses."
First, that's the sum total of the methodological detail we're given. We know nothing about the characteristics of the sampling frame or of the respondents; in other words, we have no idea whether these employers are representative of particular sectors, particular states, and so on. Equally important, while the author didn't do the math for us, a simple calculation (704 responses out of 50,000 invitations) reveals a response rate of just 1.4%.
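For the record, that calculation takes one line; here it is in Python, using nothing but the two figures the methodology note provides:

```python
invited = 50_000    # employers invited to participate, per the methodology note
responses = 704     # completed responses reported
print(f"Response rate: {responses / invited:.1%}")  # Response rate: 1.4%
```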
This is insanely low. So low, in fact, that I don't think the results should be published at all, let alone featured on the website. And if they are published, a banner at the top of the story should proclaim, "PROCEED WITH CAUTION!" I said the same thing about an Inside Higher Ed survey last year, and that one had a response rate of 7%.
Yes, I understand that surveys are expensive to field and truly costly to do well, but at the very minimum the public deserves far more clarity and transparency in the reporting of methods than the Chronicle has provided here. Moreover, there are well-established best practices for fielding web surveys (which I have to assume this was), and it sure doesn't seem like Maguire Associates followed them. Paul Umbach of North Carolina State, for example, wrote a nice piece specifically for those conducting higher education research that lays out the issues point by point.

Given that low response rates can produce sample bias (undermining generalizability), low statistical power, and inaccurate estimates, the researchers should have drawn a random sample from the target population just large enough to achieve sufficient power; in other words, they should have gone after a fraction of those 50,000 employers with solid methods instead of blanketing all of them with weak technique (the sketch below shows how small a sample that approach actually requires). The overarching goal should have been a decent response rate, to reduce the possibility of a non-representative sample. Here, the response rate is pathetically low (about as low as I've ever seen), and no information on representativeness is provided.

This doesn't seem uncommon for Maguire Associates: this 2011 survey report says nothing about sampling or response rates, nor does this one from 2010, though this one from 2005 and this one from 2008 provide at least some of the information that Chronicle readers deserve.
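To make the "sample fewer people, better" point concrete, here is a back-of-the-envelope sketch using the textbook sample-size formula for estimating a proportion. The 95% confidence level and the ±5 percentage point margin of error are my illustrative assumptions, not parameters from the Chronicle study:

```python
import math

def required_sample_size(margin=0.05, z=1.96, p=0.5, population=None):
    """Respondents needed to estimate a proportion within `margin`
    at ~95% confidence (z = 1.96), assuming worst-case variance (p = 0.5)."""
    n = z**2 * p * (1 - p) / margin**2
    if population is not None:
        # Finite-population correction when sampling from a known frame
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)

print(required_sample_size())                   # 385 respondents
print(required_sample_size(population=50_000))  # 382, with the correction
```

In other words, roughly 400 completed responses from a well-drawn random sample, pursued with aggressive follow-up to drive the response rate up, would have told us far more than 704 self-selected responses out of 50,000 invitations.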
Publishing results this way is unethical. So I recommend that the Chronicle do what I asked Inside Higher Ed to do: take it down! And please consider working with a more credible survey firm next time.