Job seekers often spend hours online researching employers and polishing their applications and résumés. Then they hit send.
And they hear nothing. Ever.
Looking for a job is hard enough without being rejected by a robot. But applicant-screening and tracking systems are increasingly powerful job-market gatekeepers. After scanning résumés, they hurl most applicants into a digital black hole.
These machine-learning systems save time and money for employers swamped by online applicants, and they could potentially reduce bias in hiring. But the tools also risk magnifying employers’ existing prejudices and rejecting worthy applicants. Most vulnerable are the most active job seekers, such as recent college grads looking for entry-level positions or older workers idled by layoffs.
“It’s a hot-button issue with college students,” prompting eye-rolls and cynicism, says Mary O. Scott, a West Hartford, Conn., campus researcher and consultant who just completed a series of in-depth student interviews at 14 universities. One senior spoke of using her “trigger finger” to respond to hundreds of online postings, but she expects few if any replies, Ms. Scott says.
Savvy job seekers can improve their odds of getting past these gatekeepers by understanding how they work. Among valuable tactics: Spice up your résumé with specific on-the-job results, use meaningful job titles and tailor your choice of words to match companies’ requirements.
These systems scan résumés and applications for keywords showing hard skills, such as financial analysis or cybersecurity, and sometimes for softer skills, like team leadership. They may ask knockout questions for must-have attributes, such as whether you can work at a particular location. Some use text tools or chatbots to administer skills tests. Most disqualify applicants who don’t meet basic requirements, then list others in a ranked order, based on how well they fit the employer’s specs.
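The screening pipeline described above — knockout questions first, then keyword matching, then a ranked list — can be sketched in a few lines. This is a minimal illustration, not any vendor's actual algorithm; the keywords, weights and candidate data are made up for the example.

```python
# A toy applicant-screening pass: knockout question, keyword scan, ranking.
REQUIRED_KEYWORDS = {"financial analysis", "cybersecurity"}  # hard skills
BONUS_KEYWORDS = {"team leadership"}                         # softer skills

def screen(resume_text, can_work_onsite):
    """Return None if the candidate is knocked out, else a fit score."""
    text = resume_text.lower()
    # Knockout question: a must-have attribute, e.g. work location.
    if not can_work_onsite:
        return None
    # Keyword matching: count required and bonus terms found in the résumé.
    hits = sum(kw in text for kw in REQUIRED_KEYWORDS)
    if hits == 0:
        return None                     # fails basic requirements
    bonus = sum(kw in text for kw in BONUS_KEYWORDS)
    return hits * 2 + bonus             # simple weighted fit score

candidates = [
    ("Ana", "Led cybersecurity audits and team leadership training", True),
    ("Ben", "Financial analysis for retail clients", True),
    ("Cy",  "Financial analysis and cybersecurity experience", False),
]
# Disqualified applicants drop out; the rest are listed in ranked order.
ranked = sorted(
    ((name, score) for name, text, onsite in candidates
     if (score := screen(text, onsite)) is not None),
    key=lambda pair: pair[1], reverse=True,
)
```

Note how small differences in wording swing the score — the weakness critics cite: a résumé that says "information security" instead of "cybersecurity" scores zero here.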
Some tools serve as job-market matchmakers. ZipRecruiter matches candidates and employers by scanning applicants’ qualifications and employers’ postings, tracking users’ behavior on the site and employing algorithms similar to those used by Amazon for suggesting products, CEO Ian Siegel says.
Rock Brouwer has hired many candidates ZipRecruiter has brought to his attention. “When I get one of those, it just makes my day,” says Mr. Brouwer, hiring manager for Pacific Service Center, a Portland, Ore., trucking-fleet repair company.
About 60% of employers admit such tools cause them to miss some qualified candidates, however, according to a 2016 survey of 1,200 job seekers and managers by CareerArc, a human-resources technology company, and Future Workplace, a research firm. Critics say the systems give too much weight to small differences between candidates.
They amount to a black box. “Often a job candidate doesn’t even know a system is in use,” and employers aren’t required to disclose it, says Sarah Myers West, a researcher at the AI Now Institute, a New York University research group. A new Illinois law will go into effect next month requiring employers to disclose and get consent for use of AI video-interviewing tools with job applicants.
Most vendors refuse to tell employers how their algorithms work. And most employers lack deep, accurate performance data.
The systems risk magnifying managers’ prejudices if those biases are reflected in the makeup of the employer’s current workforce, according to a 2018 study by Upturn, a Washington, D.C., nonprofit promoting fairness in the use of digital technology.
High performers may share traits that have nothing to do with job performance, skewing outcomes, says Mark Girouard, a Minneapolis attorney who advises employers on pre-employment screening. One vendor built a résumé-screening tool that tagged being named Jared and playing high school lacrosse as factors predicting success. “The system didn’t have a very deep set of learning data,” he says. The employer didn’t put it to use.
Even if employers and vendors aren’t trying to reject female or minority applicants, they still risk doing so if they train algorithms on data gleaned from a current workforce that lacks diversity. An employer with mostly male employees, for example, might inadvertently train a screening tool to downgrade applicants who participated in sports played mostly by women, such as field hockey.
One employer intent on reducing employee turnover found that people who lived closer to its offices tended to stay with the company longer. But screening applicants based on distance from the worksite turned out to be a proxy for race, resulting in a lack of diversity.
The systems can easily stack the deck against older workers, says William A. Rivera, senior vice president of litigation for the AARP Foundation. An employer seeking applicants with three to five years' experience might award three points for that range, two points for five to seven years and one point for more than seven, Mr. Rivera says. The result: The most experienced workers, who are also typically older than others, would likely receive a lower score and a lower ranking on a candidate list.
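The scoring scheme Mr. Rivera describes is easy to reproduce. The sketch below uses hypothetical applicants to show how the point bands push the most experienced candidate to the bottom of the ranked list:

```python
# Experience-points scheme as described: the employer's target band scores
# highest, and the score falls as experience rises past it.
def experience_points(years):
    if 3 <= years <= 5:
        return 3          # the target range
    if 5 < years <= 7:
        return 2
    if years > 7:
        return 1          # the most experienced get the lowest score
    return 0              # below the minimum

# Hypothetical applicants, keyed by years of experience.
applicants = {"4 years": 4, "6 years": 6, "20 years": 20}
ranking = sorted(applicants,
                 key=lambda a: experience_points(applicants[a]),
                 reverse=True)
# The 20-year veteran lands last on the candidate list.
```

Nothing in the rule mentions age, yet because long experience correlates with age, the ranking systematically disadvantages older applicants — the same proxy effect seen with ZIP Codes and race.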
It’s sometimes possible to tell whether an employer is using an AI-driven tool by looking for a vendor’s logo on the employer’s career site. In other cases, hovering your cursor over the “submit” button will reveal the URL where your application is being sent.
Otherwise it’s best to assume a robot will be your first-round judge. To pass the test, use clear, functional job titles that reflect progress in your career, and prove your value by quantifying results in dollars earned or number of customers gained, says Robert Meier, chief executive of Restore Hope Resources, a Tampa, Fla., job-coaching firm.
Some applicants try to game the systems by choosing answers to knockout questions that are obviously desirable rather than accurate, says Jim D’Amico, president of the Association of Talent Acquisition Professionals. Others fudge their ZIP Code to make it look as though they live in the employer’s target area.
These ploys risk annoying hiring managers, Mr. D’Amico says. Candidates weigh the risks against potential rewards. “Some candidates think, ‘To know me is to love me. If I can just get in front of you, you’re going to love me,’ ” he says. “And sometimes that’s true.”
To Get Past the Robots…
* Network to build contacts inside the company who will put in a good word for you.
* Submit your application in a text-based format, such as a Word document, rather than a PDF or other file type that scanners may parse poorly.
* Include in your résumé keywords and phrases from the employer’s job posting.
* Quantify past results, citing dollars earned or other stats.
* Camouflage brief gaps in work history by listing years only, rather than years and months.
* List job titles in a way that shows increasing responsibility and status.
Source: The Wall Street Journal