Daily Mail PH

Thursday, August 24, 2023

How AI Could Lead To Discriminatory Hiring Practices

Crypto Breaking News

Aug 24

The topic for this article was chosen by Decrypt readers via Snapshot vote! To help shape future Decrypt content, vote here. To learn more about AI and earn a free on-chain certificate, take our free Getting Started with AI course. Decrypt will cover the gas fees for the first 10,000 mints.

With the recent dramatic uptick in artificial intelligence (AI) platforms and technology, disruptors are keen to find new ways that AI can automate a host of tasks previously performed by humans. Indeed, AI has made possible a range of projects that humans could not achieve on their own. Powerful AI tools can be used to write (and even act in) film and television projects, to generate sometimes-harmful recipe ideas, to reproduce music with the help of mind-reading, and even to provide life coach advice.

It should come as no surprise, then, that AI is seeing increased adoption by human resources branches of various companies as well as by firms dedicated to recruiting and hiring practices.

With the tremendous power and potential of AI come some very real threats and dangers as well. Among the immediate risks of AI is the likelihood that human biases and prejudices will become ingrained—consciously or subconsciously—in the AI tools that we build. A recent peer-reviewed computer science paper found that popular large language models (LLMs) like ChatGPT demonstrate political biases, for instance. Biases may emerge from the data used to train machine-learning models, from the biases of the programmers themselves, from poor calibration during the training process, or from larger systemic biases. The problem is a growing and immediate concern for many both inside and outside of the AI space, and it already contributes to issues including housing discrimination and much more.

As companies have increasingly found ways to incorporate AI into their hiring practices, these issues have come to a head. Close to two-thirds of employees in the U.S. say they have witnessed workplace discrimination, including in the recruitment and hiring processes. Below, we take a closer look at the current landscape of AI and discriminatory hiring practices, as well as some broader concerns for the future of this space.

How Do Companies Use AI in Their Hiring?

It makes sense that companies would want to automate aspects of HR. Recruiting is time-consuming, expensive, and repetitive, while AI is designed to process vast amounts of data with tremendous speed. Some 99% of Fortune 500 companies and 83% of all employers use automated tools of some kind as part of their process of recruiting and/or hiring employees. Indeed, about 79% of employers that use AI at all use it specifically to support HR activities. While the practice is widespread, it's important to keep in mind that companies may adopt automation and AI in a wide variety of ways when it comes to hiring, some of which are much more extensive than others.

AI programs are capable of assisting with—or completely taking over, in some cases—everything from recruiting to interviewing to onboarding new employees. AI programs can scan through troves of resumes or LinkedIn profiles to source potential candidates for a job, sending along personalized messages to attempt to recruit top targets. These tools can act as chatbots to smooth the application process and answer questions from applicants. They can evaluate application materials and make recommendations for people to advance to the next steps in the hiring process. AI programs can even schedule and assist with the interviewing and negotiating processes and assist HR in writing layoff notices. Unfortunately, bias may be found in any of these areas, although some remain largely theoretical for the time being.
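
Resume scanning is often the first of these automated steps. Below is a minimal sketch of the kind of keyword-overlap scoring such a tool might perform; the function name, data, and scoring rule are all illustrative assumptions, not any vendor's actual method.

```python
# Hypothetical sketch of naive resume scoring by keyword overlap.
# Everything here is illustrative, not a real screening product.
import re

def score_resume(resume_text: str, job_keywords: set[str]) -> float:
    """Return the fraction of job keywords that appear in the resume."""
    words = set(re.findall(r"[a-z]+", resume_text.lower()))
    if not job_keywords:
        return 0.0
    return len(words & job_keywords) / len(job_keywords)

keywords = {"python", "kubernetes", "leadership"}
print(score_resume("Led a Python team; deployed on Kubernetes.", keywords))
```

Even a sketch this simple hints at where bias can creep in: whoever chooses the keyword list, and whatever historical data informed it, shapes which applicants advance.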

Bias in AI Recruitment

In 2018, Amazon scrapped a tool that it had developed over several years to help automate its employee search process by reviewing applicant resumes. The model, which had been trained on a set of resumes submitted to Amazon over a 10-year period, displayed bias against non-male applicants. One likely reason for this bias was the data set itself—most applications in the pool came from male applicants, leading the model to "learn" that male candidates were preferable. The model indeed rated applications lower when they included words like "women's" or made reference to all-women's colleges. Despite the company's efforts to address these issues, it ultimately abandoned the project entirely. Even in recent years, Amazon's efforts to incorporate AI into other projects—including a set of facial recognition tools designed to aid law enforcement and related agencies—have drawn backlash over allegations of inherent bias.
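
The failure mode described above is easy to reproduce in miniature. The toy model below (an invented example, not Amazon's actual system) learns per-token hire rates from a skewed historical data set and ends up penalizing the token "women's" purely because it co-occurred with past rejections.

```python
# Toy demonstration of bias learned from skewed training data.
# The data and the "model" are inventions for illustration only.
from collections import Counter

# Historical records: (tokens in resume, was hired). Hires skew male-coded.
training = [
    ({"engineer", "chess"}, 1),
    ({"engineer", "football"}, 1),
    ({"engineer", "chess"}, 1),
    ({"engineer", "women's"}, 0),
    ({"engineer", "women's"}, 0),
]

hired = Counter()
seen = Counter()
for tokens, label in training:
    for token in tokens:
        seen[token] += 1
        hired[token] += label

def token_score(token: str) -> float:
    """Estimated P(hired | token appears), from the toy data."""
    return hired[token] / seen[token] if seen[token] else 0.5

print(token_score("chess"))    # high: co-occurred only with hires
print(token_score("women's"))  # low: co-occurred only with rejections
```

Nothing in the code mentions gender; the bias is inherited entirely from the historical labels, which is exactly why "train on past hiring decisions" is a risky design choice.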

Even AI systems designed to guard against bias toward non-male job applicants may have a difficult time maintaining neutrality. Research has shown that women frequently downplay their skills on resumes, while men are more likely to exaggerate theirs. Similar biases can emerge relating to race, age, disability, and much more. As the list of screening and pre-screening tools like Freshworks, Breezy HR, Greenhouse, and Zoho Recruit continues to grow, so too does the potential for bias.

Other Types of AI Hiring Bias

AI bias in hiring can take many other forms as well. AI tools such as HireVue aim to use applicant computer and cellphone cameras to analyze facial movements, speaking voice, and other parameters to create what the company calls an "employability" score. Detractors of this type of practice say it is rife with potential for bias against a wide range of applicants, including non-native speakers, people with speech impediments or other medical issues impacting speech and movement, and more.

Another company developing an AI tool for hiring, Sapia (previously known as PredictiveHire), has used a chatbot to ask candidates questions. Based on responses, it provides an assessment of traits such as "drive" and "resilience." Again, detractors have said that this type of tool, which also seeks to estimate an applicant's likelihood of "job hopping" between positions, may hold biases against some candidates.

Other types of AI tools used in hiring practices may approach the pseudoscience known as phrenology, which claimed to link skull patterns to personality characteristics. These include some facial recognition services that may be inclined to mischaracterize certain applicants in biased ways. A 2018 study from the University of Maryland, for example, found that Face++ and Microsoft's Face API, two such facial recognition tools, tended to interpret Black applicants as having more negative emotions than white counterparts. HireVue discontinued its practice of facial analysis in early 2021, following a complaint filed with the Federal Trade Commission by the Electronic Privacy Information Center.

A 2017 study found that deep neural networks were consistently more accurate than humans at detecting sexual orientation from facial images. Other AI tools, like DeepGestalt, can accurately predict certain genetic diseases from facial images. These capabilities could potentially lead to bias in recruiting and hiring, whether intentional or otherwise.

What Is Being Done

Many AI developers and companies utilizing AI in their hiring processes are working to eliminate biases as completely as possible. Fortunately, there are also outside efforts to monitor and regulate how AI is used in hiring.

In 2021, the U.S. Equal Employment Opportunity Commission launched an initiative to monitor how AI is used in employment decisions and to enforce compliance with civil rights laws. At the end of 2021, then-attorney general of D.C. Karl Racine announced a bill aiming to ban algorithmic discrimination, while lawmakers from Oregon, New Jersey, and New York introduced the Algorithmic Accountability Act of 2022 with similar aims. The latter bill stipulated impact assessments to determine whether AI systems might suffer from bias and other issues, but it failed to pass before the congressional session ended in early 2023. More recently, a New York City law aiming to address AI discrimination in employment practices went into effect in mid-2023.

Even if regulation is slow to catch up to some of the dangers and risks inherent in AI used for hiring purposes, businesses may be inclined to make adjustments on their own if it becomes clear that such tools could pose a threat. For example, if using a particular AI tool may open up a company to the possibility of discrimination suits or other legal trouble, that company may be less likely to adopt that practice.

Fortunately, job applicants may be able to work to overcome some of these issues as well. Companies using resume-scanning tools are likely to search for keywords matching the language from the job description. This means that resumes incorporating action-focused words drawn from the job posting itself may be at an advantage. Applicants may even give themselves a leg up by simplifying the format of the resume itself and submitting a common file type, both of which may be easier for AI tools to scan.
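
Assuming a screener does this kind of naive keyword matching, the advice above can be sketched as a small helper that lists frequent job-posting words missing from a resume draft. The helper, stopword list, and sample text are purely illustrative, not a real tool.

```python
# Hypothetical helper: which frequent job-posting words are absent
# from a resume draft? Illustrative only; real screeners vary widely.
import re
from collections import Counter

STOPWORDS = {"the", "and", "a", "an", "to", "of", "in", "for", "with"}

def missing_keywords(posting: str, resume: str, top_n: int = 5) -> list[str]:
    """Return up to top_n frequent posting words not found in the resume."""
    post_words = [w for w in re.findall(r"[a-z]+", posting.lower())
                  if w not in STOPWORDS]
    resume_words = set(re.findall(r"[a-z]+", resume.lower()))
    ranked = [w for w, _ in Counter(post_words).most_common()]
    return [w for w in ranked if w not in resume_words][:top_n]

posting = "Seeking engineer to design and deploy scalable services in Go."
resume = "Software engineer experienced in distributed services."
print(missing_keywords(posting, resume))
```

An applicant could run a check like this against the posting and fold the missing action words into the resume, alongside the formatting advice above (simple layout, common file type).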

Source: Decrypt.co

