Bad Data

Last week, the polling companies came under attack for failing to accurately forecast the results of the 2015 UK General Election. This story has been like catnip to the research community; we've struggled to suppress our grins on being told that the results of supposedly comprehensive large surveys do not reflect reality. If you'd like to know more about the pollsters' inability to predict a Conservative majority, the BBC do a great job of explaining things here.

Current Big Data and quantitative research in recruitment is ideal when you need a broad-brush view of what's going on in your sector and a general understanding of sentiment. But if we've learned anything in the past week, it's that bad samples make for bad knowledge. When it comes to research, if you're listening to the wrong people, it doesn't matter how many respond: you're not going to learn the right stuff. And the more specific your needs, the more important good data is.

When running recruitment campaigns to target specific talent, the quality of external perception research you conduct prior to campaign launch is key to saying the right things to the right people in the right way. This is especially true because the theoretical and actual talent pools are rarely as similar as is optimistically assumed. For example, suppose you have been told that a local talent market contains five hundred engineers, and your power-generation company needs to hire ten. That should be relatively easy to hire from, right? But it's not; no matter how much you advertise, you're not getting the right applicants. So you commission a poll that samples the opinions of over two hundred and fifty people working in engineering within a 25-mile radius, and guess what? The survey tells you your salary is competitive, your employer brand is sound … there is no obvious reason why people don't want to work for you.

Only here's the thing – you need a specific sort of engineer. You need electrical engineers specializing in industrial power generation (and only 40 of these actually exist nearby, of whom 15 already work for you). And the survey sample thrown up by your third-party data provider? Well, that contains the thoughts of gas installation engineers, telephone engineers, software engineers… In fact, there's only one response from an engineer who is even vaguely relevant. She's worked for you before and it didn't end well. You're left with some impressive-sounding numbers and hopefully some attractive pie charts, but you're still no further forward in understanding how to attract the talent you need. You know nothing about the cohort with a relevant (and hopefully educated) opinion on your employer experience.

When it comes to understanding the needs of target professional hires, relevant opinions are rarely easy to come by. Finding the ideal participants for an online poll in a specific market is usually a matter of luck over judgement. Very few data sets are nuanced enough to capture the specific qualifications of key professionals in most locations. Finding their opinions requires a researcher who can go one step further. It's not good enough to put together a terrific interview script and hold a conversation – your researcher also needs to be capable of sourcing and reaching relevant talent, approaching participants with credibility (often at antisocial hours), and providing qualitative analysis that is balanced and nuanced. Twelve opinions might not sound as impressive as two hundred and fifty (and given the time and incentives required to find relevant participants, the cost might even be higher). The question is: which piece of research would you trust to deliver your organization the most value?
