Five ways that people data confounds us
Recently I was interviewed on a podcast, and I was asked how someone with a Pure Mathematics background could have gotten involved in a 'grubby business' like People Analytics.
Of course the interviewer was asking in jest, and what he was aiming to illustrate by his question was that there is very little 'purity' involved when you are analyzing people. A person is the most complex thing on this planet - biologically and psychologically - and that is why I love what I do. Statistics and mathematics are powerful tools, but they are constantly challenged when people data is involved.
Here are a few ways that people data constantly confounds us:
1. Accuracy and Reliability
The vast majority of people-related data in organizations is collected through processes that are slaves to judgment, compliance, circumstance and environment, all of which are laced with bias and error. Our medical analytics colleagues can rely on the scientific accuracy of a test for ketones in urine or leukocytes in blood, but we need to rely on Amira's test score on a day when she was very distracted, or Joe's performance rating when his supervisor was in a bad mood.
2. Discrimination
Particularly in employment contexts, many of the measures that People Analytics professionals deal with are not useful mathematical differentiators. Performance metrics have a tendency to 'glorify the average', often with as few as 10% of people falling into the extremes. Attitudinal measures generally creep to the right, so that even an 'above average' rating can be considered cause for drastic action.
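To make that concrete, here is a tiny sketch in Python. The numbers are entirely made up for illustration (no real rating data): a 5-point scale where almost everyone clusters just above the midpoint and only a sliver sits in the extremes.

```python
from collections import Counter

# Hypothetical 5-point ratings for 100 people (illustrative only):
# most people land on 3 or 4, very few at either extreme.
ratings = [1] * 2 + [2] * 6 + [3] * 38 + [4] * 46 + [5] * 8

counts = Counter(ratings)
n = len(ratings)

mean = sum(ratings) / n
extremes = (counts[1] + counts[5]) / n  # proportion at the tails

print(f"mean rating: {mean:.2f}")                # creeps above the midpoint of 3
print(f"share in the extremes: {extremes:.0%}")  # only ~10% at ratings 1 and 5
```

With a distribution like this, most of the scale carries almost no information: the bulk of people are statistically indistinguishable from one another.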
Oh how I long for a nice neat bell curve!
3. Range restriction
Even if I did get my nice neat bell curve, range restriction is the constant scourge of the Psychometrician. I have spent my entire career taking input measures on people, of whom only a very small proportion ever reach the point where I can obtain an output measure.
Recently I drew a 'progression funnel' for someone I was advising, describing how 100% of people might make a job application to their organization, but as they progress through the various stages of screening, interviewing, joining, being promoted, etc., that number shrinks to ridiculously small levels such as 0.5%. I explained my general common-sense rule of 'never try to correlate/validate more than one step ahead'. They told me that their leaders wanted to validate five steps ahead. Good luck with that!
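As a back-of-the-envelope sketch of why that is hopeless, here is a funnel with invented pass rates (none of these figures come from a real organization). Watch how quickly the sample you could validate against evaporates.

```python
# Hypothetical pass rates at each stage of a progression funnel
# (illustrative numbers only, not from any real organization).
stages = [
    ("application", 1.00),
    ("screening",   0.30),
    ("interview",   0.40),
    ("offer/join",  0.50),
    ("year one",    0.70),
    ("promotion",   0.12),
]

applicants = 10_000
remaining = applicants
for name, pass_rate in stages:
    remaining = round(remaining * pass_rate)
    print(f"{name:12s} {remaining:6d} ({remaining / applicants:.2%})")
```

Five steps in, 10,000 applicants have dwindled to about 50 promoted people (0.5%) — and those 50 are the only ones for whom you have the outcome you are trying to validate against.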
4. Imbalance
As I moved from pure Psychometrics into advanced analytics a couple of years ago, I started to see the mathematical challenges of dealing with people data and modelling people processes. Related to the previous point, modelling highly selective processes creates an imbalance challenge - you have a lot of data to learn about who fails, but much less to learn about who succeeds.
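A quick illustration of why that imbalance bites (again, with made-up figures): in a process where only 0.5% of people succeed, a model that simply predicts 'fail' for everyone looks impressively accurate while telling you nothing at all about success.

```python
# Illustrative figures: a highly selective process where 0.5% succeed.
n_people = 10_000
n_success = 50
n_fail = n_people - n_success

# A degenerate classifier that always predicts "fail" is right for
# every failure and wrong for every success:
accuracy = n_fail / n_people
recall_on_success = 0 / n_success  # it never identifies a single success

print(f"accuracy: {accuracy:.1%}")                      # 99.5% - looks great
print(f"recall on successes: {recall_on_success:.0%}")  # 0% - completely useless
```

This is why raw accuracy is a trap with imbalanced people data: you have to look at metrics that account for the rare class, or rebalance before modelling.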
5. Everyone else knows the answer
While you are twisting your dedicated and committed brain around points one through four, never fear! Numerous experienced interviewers, managers, etc. already know the answer to your problem. They have hired three people and they all have this one thing in common, whatever that is!
But isn't it those challenges that make us love what we do? Why work in a field where the mathematics is entirely pure and predictable? I did that a few years back and, let me tell you, I have a lot more fun nowadays.
Comrades in People Analytics, don't despair! We are all facing similar challenges as pioneers of our discipline. Keep the faith, maintain the community and who knows where it might lead!
I lead McKinsey's internal People Analytics and Measurement function. Originally I was a Pure Mathematician, then I became a Psychometrician. I am passionate about applying the rigor of both those disciplines to complex people questions. I'm also a coding geek and a massive fan of Japanese RPGs.
All opinions expressed are my own and not to be associated with my employer or any other organization I am associated with.