Yesterday we looked at the cost associated with falling behind on the TA tech and process curve. Today we’ll look at how that plays out in the real world, by comparing a recruiter using traditional methods vs. one leveraging recruitment process changes.
(A note: the individuals in these examples are by no means representative of the entire spectrum of recruiters and their strategies, though they are based on real feedback from recruiters and hiring managers.)
Recruiter 1: Jim is our classic, traditional recruiter. He’s got a knack for picking out the best candidates from a six-second resume screen, and excels at getting the information he needs from a phone interview.
Recruiter 2: Sarah is a digital age recruiter. She makes use of tech solutions and process changes to assist her in the sourcing and decision-making process.
Both Jim and Sarah are working to fill a middle-management role as quickly as possible, without sacrificing quality of hire. Let’s look at how they accomplish this.
Recruiter 1: Jim posts his job listing across the web: Indeed, Monster, LinkedIn - all the major job posting sites. Since he’s in a bit of a time crunch, he pays for sponsored postings on these sites in order to get as many eyes on the listing as possible.
Recruiter 2: Sarah posts her job listing on the same major hiring channels, but she also reaches out to past applicants who have been kept engaged through her candidate relationship management (CRM) software. With it, she can source from a pool of candidates who have expressed interest in her organization in the past, but may not have been hunting for a position at the time.
Recruiter 1: With a substantial number of candidates pooled into his ATS from various online channels, Jim begins the screening process. He starts sorting candidates by years of experience, then by experience in particular industries. From these he creates a list of candidates to complete a pre-employment assessment, and invites each to do so.
Recruiter 2: Sarah’s candidate pool is substantially deeper than Jim’s, thanks to her ability to source from applicants not looking for immediate employment. She invites every sourced candidate to complete an on-demand video interview that doubles as a pre-employment assessment.
Recruiter 1: Like most pre-employment assessments, Jim’s invite received a lukewarm reaction. Clocking in at 400 questions, the assessment resulted in a 30% candidate drop-off (job application completion rates drop by 50% when an application asks more than 50 questions). Of the candidates who completed the assessment, Jim invites those with the best scores to schedule a phone screen.
In every phone screen, Jim scores the candidate’s soft skills, like communication ability. He forwards the best candidates’ scores and resumes to the hiring manager.
Recruiter 2: With a mixture of I/O Psychology and data science, the video responses of Sarah’s candidates are scored against a model built on the responses of her organization’s top performers. Sarah watches the best of these to get a feel for who each candidate is and evaluates their soft skills. Data in hand, she forwards the best candidates’ interviews to the hiring manager for review.
Recruiter 1: Jim works with the hiring manager and candidates to identify the best times to hold an in-person interview.
Recruiter 2: With interview management software, Sarah’s candidates are able to match their availability with that of the hiring manager.
Recruiter 1: After a final decision is made, Jim reaches out to those that did not make the cut and informs them that his organization has “moved forward with another candidate.”
Recruiter 2: After a final decision is made, Sarah reaches out to those that did not make the cut, informs them that her organization has moved forward with another candidate, and encourages them to fill out a general application. Where applicable, she provides feedback and identifies current openings that would be a good fit, based on their performance in the video interview.
Let's examine the difference in graphic form, where the number of candidates left in the running after each step is roughly represented by the size of the circle:

[Graphic: the traditional recruiter's hiring funnel]

Compare that to:

[Graphic: the digital-age recruiter's hiring funnel]
The times presented in the above example are based on customer feedback and recruiter anecdotes, and were not arrived at with any sort of scientific rigor. While shrinking time to fill by more than 50% is pretty much the norm with HireVue, obviously every situation is different.
Speed of Hire: If both of the recruiters in the above example were vying for the same candidates, our digital-age recruiter would have hired the best performer before the traditional recruiter finished compiling the results of his assessment.
Quality of Hire: It stands to reason that the assessment built specifically for the second recruiter's organization would have just as much (if not greater) validity when analyzing each candidate. And since Sarah pulled from a candidate pool that had expressed prior interest in her organization, the eventual hire is likely to have a higher level of engagement on the job.
Candidate Experience: As far as candidate experience is concerned, there's no competition. Between the 400-question assessment and multiple rounds of interview scheduling, the traditional hiring process is not built with the candidate's best interests at heart. And, of course, the best experience for the candidate is the one that gets them the quickest response.