Artificial Intelligence, Machine Learning, Bias Mitigation & Compliance Primer

Written by Magx D

RippleMatch is committed to ensuring our products are free of bias and treat candidates with equal dignity and respect. This page provides some answers to common questions about how the RippleMatch platform functions and how RippleMatch works to minimize the potential for bias in our technology.

List of Topics Covered

  1. Does RippleMatch use Artificial Intelligence (AI) or Machine Learning (ML)?

  2. How does RippleMatch work from a technical standpoint?

  3. Does RippleMatch have the potential for bias?

  4. What is the new New York City law about AI and automated assessments in hiring?

  5. What does the new law require?

  6. How will RippleMatch comply with the new law?

  7. What does this mean for my organization?

Does RippleMatch use Artificial Intelligence (AI) or Machine Learning (ML)?

Yes, for some functions. RippleMatch uses AI/ML to help suggest opportunities to job seekers who are likely to apply, but it does not use AI/ML to assess or evaluate applicants. We provide more detail on this in the next section.

How does RippleMatch work from a technical standpoint?

RippleMatch uses two methods to match candidates to roles:

  1. RippleMatch helps job seekers discover opportunities they are likely to be a good match for.

    1. For each job posted on RippleMatch, employers select the skills and experiences they are looking for. RippleMatch uses these preferences to automatically market jobs to candidates who are likely to be a good fit and interested in the position.

    2. Over time, RippleMatch learns which job seekers are likely to apply to certain opportunities. This use of Machine Learning (ML) helps us provide more qualified applicants for employers and suggest relevant jobs to job seekers.

  2. RippleMatch helps employers prioritize responding to their best applicants.

    1. After a candidate applies for a job, RippleMatch calculates a Fit Score based on the candidate’s overall match for the job. Employers select the criteria (skills and experiences) that determine a candidate’s Fit Score for a job and may change these criteria at any time.

    2. Separately, RippleMatch flags candidates who don’t meet a minimum requirement set by the employer, such as a GPA minimum or graduation date.

    3. RippleMatch does not use AI or ML to calculate Fit Scores or to flag candidates who don’t meet minimum role requirements. Fit Scores and unqualified flags change only if an employer adjusts their desired criteria for the role. A simplified illustration of this rule-based approach appears after this list.

    4. Fit Scores are never influenced by how an applicant identifies across race, ethnicity, gender, or any other protected characteristic.
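
To make the rule-based nature of Fit Scores and minimum-requirement flags concrete, here is a minimal, hypothetical sketch in Python. The field names, criteria, and scoring scheme are illustrative assumptions made for this FAQ, not RippleMatch's actual implementation.

```python
# Hypothetical illustration only -- the field names, criteria, and scoring
# scheme below are assumptions, not RippleMatch's actual implementation.
from dataclasses import dataclass
from typing import Optional


@dataclass
class RoleCriteria:
    required_skills: set                    # skills/experiences the employer selected for the role
    min_gpa: Optional[float] = None         # optional employer-set minimum requirement
    latest_grad_year: Optional[int] = None  # optional employer-set graduation-date cutoff


@dataclass
class Applicant:
    skills: set
    gpa: float
    grad_year: int


def fit_score(applicant: Applicant, criteria: RoleCriteria) -> float:
    """Deterministic match percentage: share of employer-selected skills the applicant has."""
    if not criteria.required_skills:
        return 100.0
    matched = len(applicant.skills & criteria.required_skills)
    return round(100.0 * matched / len(criteria.required_skills), 1)


def meets_minimums(applicant: Applicant, criteria: RoleCriteria) -> bool:
    """Rule check against employer-set minimums (e.g. GPA, graduation date); no model involved."""
    if criteria.min_gpa is not None and applicant.gpa < criteria.min_gpa:
        return False
    if criteria.latest_grad_year is not None and applicant.grad_year > criteria.latest_grad_year:
        return False
    return True


# Example: changing the employer's criteria is the only way these outputs change.
criteria = RoleCriteria(required_skills={"Python", "SQL", "Data Analysis"}, min_gpa=3.0)
applicant = Applicant(skills={"Python", "SQL"}, gpa=3.4, grad_year=2024)
print(fit_score(applicant, criteria))       # 66.7
print(meets_minimums(applicant, criteria))  # True
```

Note that no demographic attributes appear anywhere in the calculation, consistent with the points above.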

Does RippleMatch have the potential for bias?

RippleMatch is committed to ensuring our products limit bias to the greatest extent possible and treat candidates with equal dignity and respect. To ensure these outcomes, we partner annually with a third-party bias auditor, babl.ai, to review our technology and internal processes. As of April 15, 2023, New York City Local Law 144 requires many companies using software in hiring to complete annual bias audits.

Our latest bias audit was completed by babl.ai on January 5, 2023. Using data from 170,000+ applications, the auditors investigated whether candidates experienced disparate impact based on race and gender. RippleMatch passed the audit unanimously and with no exceptions. The auditors noted, “The requirements of our audit go significantly beyond what is required by the NYC law; RippleMatch chose to hold themselves to a higher standard.” Full copies of the bias audit are available upon request.
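
For context on what a disparate-impact review measures, the sketch below shows a standard impact-ratio calculation: each demographic category's selection rate divided by the rate of the most-selected category. The group labels and rates are invented for illustration and are not figures from the RippleMatch audit.

```python
# Illustrative impact-ratio calculation; the categories and rates are made-up
# example numbers, not data from the RippleMatch bias audit.
def impact_ratios(selection_rates: dict) -> dict:
    """Impact ratio = a category's selection rate divided by the highest category's rate."""
    highest = max(selection_rates.values())
    return {group: round(rate / highest, 2) for group, rate in selection_rates.items()}


rates = {"Category A": 0.42, "Category B": 0.39, "Category C": 0.35}
print(impact_ratios(rates))  # {'Category A': 1.0, 'Category B': 0.93, 'Category C': 0.83}
```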

What is the new New York City law about AI and automated assessments in hiring?

On December 10, 2021, New York City Mayor Bill de Blasio allowed legislation passed by the City Council to become law, regulating the use of AI in hiring by companies looking to hire candidates who are residents of New York City. The law was originally set to take effect on January 1, 2023; after an initial delay, it took effect on April 15, 2023.

What does the new law require?

While the law is vague in its language and may be further clarified through amendments or rulemaking (something we are monitoring closely), the main requirement is that employers are prohibited from using an AI-type tool to screen job candidates or evaluate employees unless (1) the technology has been audited for bias no more than one year before its use, and (2) a summary of the audit's results has been made publicly available on the employer's website.

How will RippleMatch comply with the new law?

RippleMatch is committed to ensuring our products limit bias to the greatest extent possible and treat candidates with equal dignity and respect. To ensure these outcomes and full compliance with the law, we partner annually with a third-party bias auditor, babl.ai, to review our technology and internal processes.

Our latest bias audit was completed by babl.ai on January 5, 2023. Using data from 170,000+ applications, the auditors investigated whether candidates experienced disparate impact based on race and gender. RippleMatch passed the audit unanimously and with no exceptions.

What does this mean for my organization?

RippleMatch will handle everything needed to keep our customers compliant with the new NYC law in their usage of our platform. However, if you are using or evaluating other vendors that rely on AI or automated decision-making in their technology, you should ask them about their plan to comply with the April 15, 2023 effective date; otherwise, you could face financial penalties of up to $500 for a first violation and between $500 and $1,500 per day for each subsequent violation.
