A few months back, we started research on viewing Recruitment as a Forecasting function. Conceptually, this made a lot of sense: Recruiters increasingly handle large volumes of data and sample from a pool of profiles based on a set of constraints (skills required, location, time to hire, wage limits, and a host of other criteria). Imagining a Recruiter as a Forecaster is a natural progression of the job, but with it comes a greater responsibility. It requires a more profound sense of self-investigation and self-awareness.
The good news about Forecasting is that one can keep improving at it, provided there is a set of systematic steps and processes in place. In Superforecasting, Bill Flack, a retired USDA employee who lives in Kearney, Nebraska, is contrasted with famous media pundits and other experts who purport to predict everything from hot stock picks to presidential politics. Flack has proven himself remarkably accurate across 300 geopolitical questions by following a very disciplined data collection and analysis process.
Forecasting is a great skill that can be acquired in the digital age and need not be reserved for a few pundits. The problem with pundits is that, while they may tell a good story about a possible future, they are seldom held accountable for their predictions (who has really checked the accuracy of every forecast of, say, a Tom Friedman?).
Imagine the power of being so precise in finding and predicting who will be the best fit for an organization – that is Super Recruiting.
But this journey is not easy. This email is an attempt to unpack a framework that will be useful for you. Our framework has many elements, but here we have written about two themes:
- Handling Bias
- Making Talent Intelligence Relevant for execution
(It is an eight-point framework – here I have addressed two. If you need the full framework, please write to us, or we will cover the rest in subsequent emails.)
The following excerpts are from a University of Minnesota dissertation by Siwen Shu. Shu states that despite extensive research showing that mechanical (statistical) combinations of predictors are consistently equal to or superior to human judgment (e.g., Grove, Zald, Lebow, Snitz, & Nelson, 2000), most decisions made in work settings are heavily based on human intuition. Various explanations have been offered for this resistance to statistical decisions. Besides apparent factors, such as a lack of knowledge and education in statistical methods (Vrieze & Grove, 2009) and belief in human intuition (Highhouse, 2008), there are contextual reasons for this resistance. In an employment setting, selection is fundamentally a sociopolitical process (Cleveland & Murphy, 1992). Organizational factors, such as politics and culture, also promote the use of human judgment while discouraging the use of purely statistical methods (Johns, 1993; Muchinsky, 2004; Terpstra & Rozell, 1997).
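To make the idea of a mechanical (statistical) combination concrete, here is a minimal sketch in the spirit of Grove et al. (2000): every candidate is scored by the same fixed weighted sum of predictor values, with no per-candidate judgment calls. The predictors, weights, and scores below are illustrative assumptions, not data from the research cited.

```python
# Illustrative mechanical combination of predictors: a fixed weighted sum
# applied identically to every candidate. Weights and predictor names are
# hypothetical placeholders for whatever validated predictors a team uses.
PREDICTOR_WEIGHTS = {
    "structured_interview": 0.40,
    "work_sample": 0.35,
    "cognitive_test": 0.25,
}

def mechanical_score(candidate: dict) -> float:
    """Weighted sum of predictor scores (each assumed pre-scaled to 0-100)."""
    return sum(candidate[p] * w for p, w in PREDICTOR_WEIGHTS.items())

# Two hypothetical candidates with scores on each predictor.
candidates = {
    "A": {"structured_interview": 70, "work_sample": 85, "cognitive_test": 60},
    "B": {"structured_interview": 80, "work_sample": 65, "cognitive_test": 75},
}

# Rank candidates by the same formula -- the formula, not the mood of the
# moment, decides the ordering.
ranking = sorted(candidates, key=lambda c: mechanical_score(candidates[c]),
                 reverse=True)
```

The point is not the particular weights but the discipline: the same rule is applied to every profile, which is exactly what intuition-driven screening fails to do.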
I am sure all Recruitment Organizations are talking about Bias and how not to be biased. Last week, my interviews with six recruiters showed that, while there is awareness of bias, there is very little know-how on how to handle it. As a result, I spent this week poring over publications on the biases a Recruiter may face. The research papers on Bias are overwhelming, so we have abstracted them into four categories. As Daniel Kahneman notes, mere understanding and acknowledgment of a bias can dramatically improve decision making.
Anchoring Bias: A recruiter goes through a profile that they like, and the mind gets fixated on it; every other profile is then compared against that one. Daniel Kahneman, a Nobel prize winner, states that a simple tool like a checklist (documenting why I liked a specific profile) is a very simple way to overcome this. If you are not able to document such a checklist, repeat the exercise after discarding the profile you liked.
Confirmation Bias: This is a prevalent form of Bias – a term used in many movie courtroom scenes. A Recruiter forms a very early impression of a candidate and then collects additional details to justify it. Recent job experience is often a data element that triggers Confirmation Bias: you find a Software Developer working at, say, Apple, and your mind has already elevated the candidate; you then gather facts to justify that view. The best way to handle it is to take a break and approach the profile from a different starting point – perhaps a patent, education, or a reference – and see if you reach the same decision.
Hindsight Bias: This is popularly known as the "knew it all along" effect. When a Recruiter gets feedback that a certain hire is not working out, the Recruiter merely amplifies the passing doubt they once had about the candidate and adopts a "knew it all along" mindset. Baruch Fischhoff was the first to document this effect through a series of elegant experiments. After the Soviet Union dissolved in 1991, many experts claimed they had known it was coming – yet the same experts had not predicted it just a few years earlier, in 1988. This effect overstates one's own ability and blocks the process of understanding where we went wrong. Super Recruiters do not do this; they decompose the feedback.
Cognitive Dissonance: More than 60 years ago, Leon Festinger made a modest proposal: people who hold two or more psychologically inconsistent cognitions experience a state of psychological discomfort called cognitive dissonance. As Recruiters, we are sometimes in two minds when extending an offer – a part of us says it is a good idea, and a part says it is a bad idea. Discussing this in a team meeting using agile frameworks can be very useful for minimizing Cognitive Dissonance.
Super Recruiters often collect a lot of data but organize it in their own way. This can also be a team effort. Here is an example of how to assemble all the reviews, Job Descriptions, and types of work that a company conducts, and build a SWOT framework from them. This SWOT framework is for Uber and can be very useful when trying to hire candidates from Uber, for example.
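One illustrative way to organize such collected material is a simple per-company SWOT store that a team can append to as research comes in. The company name and the notes below are hypothetical placeholders, not actual research on any company.

```python
from collections import defaultdict

# Sketch of a team-shared SWOT store: each note (from a review, a Job
# Description, a press item, etc.) is filed under a company and one of the
# four SWOT buckets. All entries here are hypothetical placeholders.
SWOT_BUCKETS = ("strengths", "weaknesses", "opportunities", "threats")

swot = defaultdict(lambda: {bucket: [] for bucket in SWOT_BUCKETS})

def add_note(company: str, bucket: str, note: str) -> None:
    """File one research note under a company's SWOT bucket."""
    if bucket not in SWOT_BUCKETS:
        raise ValueError(f"unknown SWOT bucket: {bucket}")
    swot[company][bucket].append(note)

# Hypothetical notes gathered from reviews and Job Descriptions.
add_note("ExampleCo", "strengths", "Strong brand attracts senior engineers")
add_note("ExampleCo", "weaknesses", "Reviews mention long on-call rotations")
add_note("ExampleCo", "threats", "Competitors actively poaching ML talent")
```

The value is less in the code than in the habit it enforces: every note lands in a shared, categorized structure instead of in one recruiter's head.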