Background Checks And AI Are Not Mixing Well, Causing Lawsuits
March 28, 2019 posted by Steve Brownstein
One night last November, Seth Tucker ordered dinner from one of his favorite fast-food chains through the delivery service DoorDash.
The restaurant was a short drive from his Concord home, but the DoorDash driver traveled all the way from Manchester to pick up the order and drop it off. Tucker and his wife recognized the need for more drivers, and saw an opportunity.
“We thought it was cool — this is the whole side-gig thing that people talk about … we could use this to help pay for our wedding, pay off some debt,” he said.
Tucker submitted an application to become a DoorDash driver and a day later he received the results of his required background check from a California company called Checkr.
“Right away, I was like, ‘No, this can’t be accurate,’ ” he said. “I freaked out.”
According to Checkr, Tucker had been convicted in Maine for causing a fatal car crash while driving drunk. It was an obvious disqualification for his DoorDash application, and none of it was true.
Since its founding in 2014, Checkr has become a darling of gig-economy giants like Uber and Lyft thanks to the speed of its services. In a press release last April announcing a $100 million influx of cash from investors, the company boasted that it processes more than 1 million background checks each month for more than 10,000 clients.
Around the same time as that announcement, co-founder and CEO Daniel Yanisse appeared on CNBC to describe how Checkr’s use of artificial intelligence to increase speed and accuracy was disrupting the background check industry.
“Background checks have been really slow, inaccurate, and not an efficient process,” he told the interviewer. “And so we brought modern software and technology to really make background checks efficient, fast, transparent. And so that was a big reason why the gig economy — the customers you mentioned, Uber, Lyft — have been able to grow so fast.”
More than 40 lawsuits
But as it chalks up clients and funding, Checkr has also been collecting something else: lawsuits.
More than 40 people have sued Checkr for violating the Fair Credit Reporting Act in recent years, according to federal court records. In December, the company settled a class-action lawsuit alleging that it illegally included information about low-level offenses like traffic infractions on background checks for more than 96,000 people.
Checkr agreed to pay out $4,460,000 in damages plus attorneys’ fees.
Checkr representatives did not respond to written questions or multiple requests for comment.
The company has also settled many of the individual lawsuits brought by people like Tucker, who claim that Checkr included erroneous criminal convictions or expunged records on their background checks, leading to lost job opportunities.
It is impossible to know how many other people were fired or not hired because of inaccurate reports but chose not to contest them in court, or how many actual criminal records Checkr might have missed.
False felony reports
In a case settled in November, a California man named Steve David Ford alleged that he was denied a job with Uber because his Checkr background report included a felony conviction for possession of a controlled substance. The conviction actually belonged to a Steven Monroe Ford, according to the lawsuit.
A Georgia woman claimed in her lawsuit that she had been driving for Lyft and Uber for two years, during which time another background check company had provided the companies with clean reports about her record. But in 2018, Checkr sent background checks to the two rideshare companies and Postmates that incorrectly stated she had been convicted of felony assault. The companies kicked her off their platforms.
Some of the lawyers who have repeatedly sued Checkr claim that the errors occur because there is no human oversight of the process.
No human oversight?
The Virginia-based law firm Consumer Litigation Associates has represented several clients suing Checkr. In their complaints, the firm’s attorneys write that documents released during discovery will show that Checkr uses “webscrape technology” to gather records from court and county websites and then “compiles this information, without checking the accuracy of its computerized record gathering and populates the information into reports … with no human being involved in the compiling, matching, and reporting of criminal-history information.”
Attorneys for Consumer Litigation Associates did not respond to a request for comment.
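If that description is accurate, the failure mode is not hard to picture. The sketch below is a purely hypothetical illustration of such a scrape-compile-report pipeline with no human in the loop; every function, record, and URL in it is invented for this article and is not drawn from Checkr’s code or court filings.

```python
# Hypothetical sketch of a fully automated background-check pipeline of
# the kind the complaints describe: scrape public court records, attach
# any record whose name loosely matches the applicant, and deliver the
# report with no human review. All names and data here are invented.

def scrape_county_records(county_url: str) -> list[dict]:
    """Stand-in for a scraper that pulls records from a court website."""
    # A real scraper would fetch and parse HTML; this stub returns
    # canned records so the sketch runs on its own.
    return [
        {"name": "Steven Monroe Ford", "offense": "felony drug possession"},
        {"name": "Jane Q. Public", "offense": "speeding"},
    ]

def same_person(a: str, b: str) -> bool:
    """Loose join: same surname and same first initial."""
    a_first, *_, a_last = a.lower().split()
    b_first, *_, b_last = b.lower().split()
    return a_last == b_last and a_first[0] == b_first[0]

def compile_report(applicant: str, county_urls: list[str]) -> list[dict]:
    """Attach every scraped record that loosely matches the applicant."""
    report = []
    for url in county_urls:
        for record in scrape_county_records(url):
            if same_person(applicant, record["name"]):
                report.append(record)
    return report  # sent to the client with no human sign-off

# "Steve David Ford" picks up Steven Monroe Ford's felony under this rule:
print(compile_report("Steve David Ford", ["https://courts.example.gov"]))
```

Under a join that loose, the two Fords from the lawsuit collapse into one person; a stricter join risks missing genuine records. Arbitrating that trade-off is exactly the work a human reviewer would normally do.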
Artificial intelligence and machine learning technology are increasingly becoming a part of the employee screening process, said Angela Preston, chairman of the National Association of Professional Background Screeners and a senior vice president at Sterling Talent Solutions. AI tools can be trained to quickly identify and categorize records and spot errors a human might overlook, such as a small difference in the spelling of two names.
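As a concrete, hedged illustration (using Python’s standard-library difflib, not any screening vendor’s actual model), plain string similarity shows both edges of that sword: the tolerance that absorbs a one-letter clerical typo can also pull two different people together.

```python
from difflib import SequenceMatcher

def name_similarity(a: str, b: str) -> float:
    """Similarity ratio in [0, 1]; 1.0 means identical strings."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# A one-letter clerical typo: almost certainly the same person.
print(round(name_similarity("Gregory Howell", "Gregory Howel"), 2))         # 0.96

# The two different men from the Ford lawsuit: still fairly similar.
print(round(name_similarity("Steve David Ford", "Steven Monroe Ford"), 2))  # 0.65

# A cutoff loose enough to absorb the typo (say, 0.6) also sweeps in the
# second pair; a strict cutoff (say, 0.9) misses real typo variants.
```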
“There is a concern that without the proper checks and balances in the system, technology could be getting it wrong,” she said, but “you also have to think about the proportionate volume of searches that any company is running” and the resulting number of mistakes.
Checkr’s stated rate of 1 million background reports processed each month translates to about 23 reports per minute (1 million reports spread across the roughly 43,200 minutes in a 30-day month), likely an impossible pace for a human reviewer to keep up with. And even with human oversight, there’s opportunity for error in any background check, if for no other reason than that criminal and other public records often contain slight spelling mistakes and inaccuracies.
For all of AI’s undeniable potential to improve screening accuracy and put millions of Americans to work faster (last year, a Gallup study estimated that 36 percent of U.S. workers participate in the gig economy), the new technologies can also introduce unpredictability into the process for workers, said Jeremy Gillula, the tech projects director for the Electronic Frontier Foundation, who researches AI and machine learning.
“I would feel comfortable using AI as a tool of first pass, but I would want any records it digs up to be verified by a human,” he said. “One of the weaknesses of machine learning models … is that they can make mistakes in unexpected ways and in ways that a human wouldn’t think about.”
Those mistakes can be hard to explain without a deep look at the underlying algorithms.
In the course of less than a month in 2016, Checkr provided background checks on Gregory Howell of North Carolina to three different companies.
The first stated that Howell’s driving record could not be found but was somehow also suspended, according to a lawsuit he filed. The second report, sent four days later, allegedly contained the correct information: He had held a valid license since 1998. The third report, sent two weeks after that, stated that Howell’s driving record began in 2016.
Two of the reports also contained information about traffic infractions that were more than seven years old, an alleged violation of the Fair Credit Reporting Act. That allegation turned Howell’s case into the $4.5 million class-action lawsuit settled late last year.
Inclination to settle
While Checkr did not admit to wrongdoing as part of the settlement, it did agree to suspend its practice of including low-level offenses older than seven years on background reports for at least 18 months.
Bringing a class-action lawsuit against the company based on its inclusion of inaccurate information would likely be harder, said California attorney Stephanie Tatar, who has filed nearly a dozen lawsuits against the company, because the nature of the mistakes varies so widely. And it may not be worth the effort — Checkr has shown an inclination to settle these lawsuits in a matter of months rather than fight them, according to a review of court records.
After Tucker received his inaccurate background check from the company last November — the one that said he had killed a person while driving drunk — he immediately tried to dispute it, he said.
Checkr sent him a copy of the same report in response, Tucker said, while DoorDash’s dispute resolution email address returned an automatic response stating that it was no longer being monitored.
But within two weeks of Tucker filing his lawsuit last month, Checkr corrected his background check and DoorDash offered to reconsider his application.
Attorney Michael Agruss, who is representing Tucker, said one of the company’s lawyers has also reached out regarding the case and he expects to settle it quickly.
“I think it’s almost just a cost of doing business for them,” Agruss said. “For every case they have to settle like Seth’s, the amount of money they’re making with these reports far outweighs it.”