

  • Tech Journalist

AI-based hiring algorithms draw warnings over bias

The purpose of hiring algorithms is to assist human resources hiring managers. AI-based hiring algorithms reduce the amount of time managers spend reading resumes that do not match job requirements. Rather than manually searching through resumes to find candidates who hold a particular professional certification, hiring managers can now rely on a program to filter candidates automatically.

There are three types of machine learning: supervised, unsupervised, and reinforcement learning. Most hiring or talent-acquisition algorithms learn in a supervised or unsupervised manner. Thousands of candidate CVs are fed into the machine learning algorithm, with the candidates who "got the job" labelled as the desired outcome. The algorithm learns from those cases and determines which profiles are best suited for specific vacancies.
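To make the supervised approach concrete, here is a minimal sketch in plain Python. The features (years of experience, whether the candidate holds a certification), the toy data, and the perceptron model are all illustrative assumptions, not a description of any real hiring system: a résumé is reduced to numbers, and the model learns weights from past hire/no-hire outcomes.

```python
# Minimal sketch of supervised resume screening (hypothetical features and data).
# Each resume is reduced to numeric features; the label is 1 if the candidate
# was hired, 0 otherwise. A simple perceptron learns weights from past
# outcomes and then scores new candidates the same way.

def train_perceptron(samples, labels, epochs=50, lr=0.1):
    """Learn a weight per feature plus a bias from labelled examples."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred  # -1, 0, or +1
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    """Score a new candidate with the learned weights."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Toy historical data: (years_experience, has_certification) -> hired?
resumes = [(5, 1), (1, 0), (4, 1), (0, 0), (6, 0), (2, 1)]
hired   = [1,      0,      1,      0,      1,      0]

w, b = train_perceptron(resumes, hired)
print(predict(w, b, (5, 1)))  # resembles past hires -> screened "in"
print(predict(w, b, (1, 0)))  # resembles past rejects -> screened "out"
```

The key point, and the root of the bias problem discussed below, is that the model has no notion of merit: it only reproduces whatever pattern separates past hires from past rejects.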

In a survey conducted by LinkedIn, 67 per cent of human resource professionals working in recruiting and hiring said that artificial intelligence helps save time. But the technology also has drawbacks.

The concerns begin with how the hiring algorithm learns from experience. The algorithm learns from previous hiring practices, including "biased" decisions. Those biases are then repeated and amplified at scale. Researchers have found that even independently built hiring algorithms can be biased. Algorithms can penalise applicants for having a name that sounds African-American, for mentioning a women's college, or even for submitting a resume in a particular file format. People who stutter, or who have a physical disability that limits their ability to use a keyboard, may also be disadvantaged.
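One common way such bias is detected in practice is the "four-fifths rule" used in US employment guidance: if one group's selection rate is less than 80 per cent of another group's, that is treated as evidence of possible adverse impact. The sketch below, with entirely made-up screening outcomes, shows how simple that audit is to run against an algorithm's decisions.

```python
# Hypothetical bias audit using the "four-fifths rule": compare selection
# rates between two groups; a ratio below 0.8 flags possible adverse impact.
# The outcome data here is invented purely for illustration.

def selection_rate(decisions):
    """Fraction of candidates in a group who were selected (1 = advanced)."""
    return sum(decisions) / len(decisions)

def four_fifths_check(group_a, group_b):
    """Return the impact ratio and whether it clears the 0.8 threshold."""
    rate_a = selection_rate(group_a)
    rate_b = selection_rate(group_b)
    ratio = min(rate_a, rate_b) / max(rate_a, rate_b)
    return ratio, ratio >= 0.8

# Toy screening outcomes (1 = advanced to interview, 0 = rejected)
men   = [1, 1, 1, 0, 1, 1, 0, 1]   # selection rate 6/8 = 0.75
women = [1, 0, 0, 1, 0, 0, 0, 0]   # selection rate 2/8 = 0.25

ratio, passes = four_fifths_check(men, women)
print(round(ratio, 2), passes)  # 0.33 False -> possible adverse impact
```

An audit like this only surfaces disparities in outcomes; it cannot by itself explain which features of the model caused them.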

According to a Reuters report, a recruitment algorithm being tested by the online retailer Amazon turned out to be sexist and had to be scrapped. The artificial intelligence system had been trained on resumes submitted by applicants over a decade, most of which came from men. The system began to penalise resumes that contained the word "women." According to Reuters, the program was modified to treat the term neutrally, but it became apparent that the system could not be relied upon.

Some AI-based tools predict a candidate's personality, experience, and skills from Facebook, Twitter, and LinkedIn data. If a candidate applies for a tech position but has no tech-related social media presence, the system will downgrade them.

In a press conference, assistant attorney general for civil rights Kristen Clarke said that officials are now sounding the alarm about the dangers of blind reliance on AI and other technologies.

Federal Officials Warn Employers Against Disparate Hiring Algorithms

New laws and regulations are attempting to limit the rapidly expanding but often opaque use of artificial intelligence (AI) to find and hire new employees. In 2020 and again in 2021, the Federal Trade Commission provided businesses with comprehensive guidance on how they may use algorithms.

Among the key points of that guidance: do not mislead consumers regarding your use of automated tools. AI typically operates in the background, somewhat apart from the consumer experience; however, when interacting with customers through AI tools such as chatbots, take care not to mislead them about the nature of the interaction. Businesses should also maintain openness and transparency when gathering sensitive data.

A business may be required to provide a "notice of adverse action" if it makes automated decisions based on information from a third-party vendor. A business that uses algorithms to assign risk scores to consumers must also disclose the key factors that influenced the score, in descending order of importance.

Suppose a business provides data about consumers to others to make decisions about consumer credit, employment, insurance, housing, government aid, check-cashing, or similar transactions. In that case, the business may be considered a consumer reporting agency and be required to comply with the FCRA. One FCRA compliance requirement is ensuring that the data provided is accurate and up to date.
