5 ways that people data confounds us - Analytics in HR


Recently I was interviewed on a podcast, and I was asked how someone with a Pure Mathematics background could have gotten involved in a ‘grubby business’ like People Analytics.

Of course the interviewer was asking in jest, and what he was aiming to illustrate by his question was that there is very little ‘purity’ involved when you are analyzing people. A person is the most complex thing on this planet – biologically and psychologically – and that is why I love what I do. Statistics and mathematics are powerful tools, but they are constantly challenged when people data is involved.

Here are a few ways that people data constantly confounds us:

1. Accuracy and Reliability

The vast majority of people-related data in organizations is collected through processes that are slaves to judgment, compliance, circumstance and environment, all of which are laced with bias and error. Our medical analytics colleagues can rely on the scientific accuracy of a test for ketones in urine or leukocytes in blood, but we need to rely on Amira’s test score on a day when she was very distracted, or Joe’s performance rating when his supervisor was in a bad mood.

2. Discrimination

Particularly in employment contexts, many of the measures that People Analytics professionals deal with are not useful mathematical differentiators. Performance metrics have a tendency to 'glorify the average', often with as few as 10% of people falling into the extremes. Attitudinal measures generally creep to the right, so that even an 'above average' rating can be considered a cause for drastic action.
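A quick simulation makes the right-creep problem concrete. The weights below are illustrative, not from the article: a hypothetical 5-point engagement survey where most people answer 4 or 5, so the nominal midpoint of the scale turns out to be a low score in practice.

```python
import random
import statistics

random.seed(1)

# Hypothetical 5-point survey responses that 'creep to the right':
# the weights are made up for illustration only.
weights = {1: 0.02, 2: 0.05, 3: 0.13, 4: 0.45, 5: 0.35}
responses = random.choices(list(weights), weights=list(weights.values()), k=10_000)

median = statistics.median(responses)
share_low = sum(r <= 3 for r in responses) / len(responses)

print(f"Median response: {median}")
print(f"Share answering the midpoint (3) or below: {share_low:.1%}")
```

With these weights the median lands at 4, so a respondent scoring the scale midpoint of 3 sits in roughly the bottom fifth of the distribution: 'average' on the scale is well below average among people.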

Oh how I long for a nice neat bell curve!

3. Range restriction

Even if I did get my nice neat bell curve, range restriction is the constant scourge of the Psychometrician. I have spent my entire career taking input measures on people for whom only a very small proportion will get to the point where I can get an output measure.

Recently I drew a 'progression funnel' for someone I was advising, describing how 100% of people might make a job application to their organization, but as they progress through the various stages of screening, interviewing, joining, being promoted, and so on, that number reduces to ridiculously small levels such as 0.5%. I explained my general 'common sense' rule of 'never try to correlate/validate more than one step ahead'. They told me that their leaders wanted to validate 5 steps ahead. Good luck with that!
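The funnel arithmetic can be sketched in a few lines. The stage names and pass rates below are hypothetical, chosen only so that five stages compound down to about the 0.5% level the article mentions:

```python
# Hypothetical stage-to-stage pass rates for a progression funnel;
# the specific stages and rates are illustrative, not from the article.
stages = [
    ("application screen", 0.30),
    ("interview",          0.25),
    ("offer accepted",     0.80),
    ("first promotion",    0.30),
    ("second promotion",   0.28),
]

applicants = 100_000
surviving = applicants
print(f"{'stage':<20}{'survivors':>10}{'% of applicants':>18}")
for name, rate in stages:
    surviving = int(surviving * rate)
    print(f"{name:<20}{surviving:>10}{surviving / applicants:>17.1%}")
```

After five stages, 100,000 applicants shrink to about 500 people, roughly 0.5% of the starting pool. Any attempt to correlate an application-stage measure with a fifth-stage outcome is working with both a tiny sample and a severely range-restricted one.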


Click here to continue reading Keith McNulty’s article.

