Employers increasingly use artificial intelligence (“AI”) hiring and recruiting software to weed out applicants. These programs are the beginning stages of automating the recruiting arm of human resources. AI recruiting software may ultimately displace the role of the hiring managers altogether. In fact, Undercover Recruiter predicts AI will replace 16% of HR positions by 2028.
These hiring programs use algorithms to sift through applications submitted to public job postings on sites like monster.com and indeed.com. However, they go much deeper than screening candidates based on experience or educational background. AI can mine candidates’ social media posts to infer their political or social persuasions, and it can even access databases containing information on their spending habits or voter registration. In some cases, AI virtual interviewers (a.k.a. recruiter chatbots) replace the human interviewer altogether. These bots ask questions and evaluate candidates’ word choices, speech patterns, and facial expressions, using the same biometric and psychometric technology our intelligence services use to analyze these markers.

But can an artificial intelligence hiring program discriminate against applicants in protected classes? It can, in two ways.
First, artificial intelligence software does not architect itself. Software programmers, project managers, and quality control personnel design these programs, and upper-level human resources personnel and C-suite executives modify and approve them. Along this continuum of development, individuals with biases leave their fingerprints. AI hiring software also unwittingly learns an organization’s previous biases when it analyzes historical hiring data, and those learned biases shape which future candidates it eliminates.

Second, even if researchers created and implemented an artificial intelligence program in a perfect vacuum, free of bias, the program could still violate discrimination laws. This occurs when the program has an unintended, disparate impact on particular protected categories. For example, an algorithm that excludes applicants with GEDs might have a disparate impact on minority candidates, regardless of intent.
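One common way regulators and experts screen for the kind of disparate impact described above is the EEOC’s “four-fifths rule”: if a selection procedure passes a protected group at less than 80% of the rate of the most-favored group, that disparity is generally treated as evidence of adverse impact. The sketch below illustrates the calculation; the group names and numbers are hypothetical, not data from any actual case.

```python
def selection_rates(outcomes):
    """outcomes: dict mapping group name -> (hired, applicants)."""
    return {g: hired / applicants for g, (hired, applicants) in outcomes.items()}

def four_fifths_check(outcomes, threshold=0.8):
    """For each group, return (impact ratio vs. the most-favored group,
    whether the ratio falls below the four-fifths benchmark)."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {g: (r / best, r / best < threshold) for g, r in rates.items()}

# Hypothetical example: an AI screen passes 60% of group A applicants
# but only 30% of group B applicants.
results = four_fifths_check({"A": (60, 100), "B": (30, 100)})
# Group B's impact ratio is 0.30 / 0.60 = 0.5, below the 0.8 benchmark,
# so the screen would be flagged for possible adverse impact.
```

Note that failing the four-fifths rule does not by itself prove unlawful discrimination; it is a starting point for the deeper statistical and legal analysis discussed below.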
The obstacle becomes proving that either of these unlawful scenarios occurred. Employers shield the development, tweaking, and purchase of AI software behind attorney-client privilege and the work-product doctrine. These programs are tightly guarded secrets, no different from the Kentucky Fried Chicken recipe. Fortunately, we at Van Kampen Law are familiar with these sorts of countermeasures; we encounter them when challenging stack-ranking evaluation systems or mandatory turnover quotas. Even once we obtain the pertinent documents, however, the person bringing the lawsuit will need to engage an algorithmic hiring expert and depose the scores of individuals involved in the AI hiring program’s development and implementation. Fortunately, established firms like Van Kampen Law maintain substantial monetary reserves and have access to lines of credit to finance this kind of complex litigation.

If you have been denied a position as part of an AI selection process, reach out to Van Kampen Law so we can assess whether your race, age, sex, national origin, disability, sexual orientation, or litigation history may have been a factor in your non-selection.