
Tackling Algorithmic Disability Discrimination in the Hiring Process: An Ethical, Legal and Technical Analysis

Maarten Buyl*, Christina Cociancig*, Cristina Frattone*, Nele Roekens*

ACM FAccT 2022

Seoul 서울, June 21-24

* equal contribution



Background

  1. Algorithmic discrimination: the unjust manner in which automated decisions impact members of a protected group or minority
  2. AI-driven automated hiring systems (AHSs), e.g. CV screening, job interview analysis, etc.
  3. Focus on candidates with disabilities (people with disabilities, PWDs)
  4. Understudied in the literature on AHSs
  5. Specific ethical, legal, and technical issues



“people with disabilities are on average ~41% less likely to receive a positive response to a job application”



Ethics of non-discrimination


Moral obligation

  1. Discrimination against PWDs is distinct from that against other groups:
      • Separation and segregation
      • People in their direct surroundings seldom share their disability
      • Physical toll and economic cost
  2. Anti-discrimination law does not suffice to protect PWDs

Information

  1. Disclosure of a disability requires trust
  2. Employers are often unaware of the legal possibilities for offering reasonable accommodation




AI Ethics


Area of research fails PWDs

  1. AI aims to generalize “the norm” → PWDs cannot participate in the datafication this requires
  2. PWDs are exposed to digital discrimination, e.g. voice recognition that fails them
  3. PWDs are almost exclusively mentioned as therapeutic users of AI

A quantitative literature study of 82 AI ethics guidelines shows:

      • Only 27% of the guidelines even mention “people with disabilities” or similar phrases
      • None acknowledge the opportunity to actively involve PWDs in advancing ethical principles in AI


Understanding of discrimination is consequentialist

  1. EU law considers the intent to discriminate irrelevant → a consequentialist stance
  2. Consequentialist theory can help recalibrate our moral compass in AI ethics



EU Law: Employment Equality Directive (EED), General Data Protection Regulation (GDPR), AI Act Proposal (AIA)


Risk of unlawful discrimination

  1. The right to reasonable accommodation applies to the entire hiring process (Arts. 3(a), 5 EED)
    • Note: violation of the right to reasonable accommodation = discrimination
  2. Presumption of discrimination: reversed burden of proof (Art. 10 EED)

Processing of disability data

  1. Some existing exceptions:
    • for obligations in employment (Art. 9(2)(b) GDPR)
    • for public interest (Art. 9(2)(g) GDPR)
    • for debiasing (Art. 10(5) AIA)
  2. Impact of non-personal data (e.g., proxies, anonymised data) → legal gap




EU Law: Practical recommendations


Partial automation

AHSs shall not be fully automated (Art. 22 GDPR)

  • Decisive role of humans (even though they can be biased as well…)

Rigorous DPIA

Mandatory data protection impact assessment (Art. 35 GDPR)

  • Identify risks for PWDs and implement reasonable accommodation

Impact of the AIA: AHSs as high-risk AI systems

New obligations (for providers), e.g.:

    • Debiasing (Art. 10(5))
    • A human supervisor (Art. 14) can decide whether to use the AHS in the first place, or to disregard its output considering the risk of bias against PWDs
    • An EU central database (Art. 60) can assist advocacy bodies and national authorities in detecting discriminatory AHSs

However…

  • Naïve assumptions, e.g. that data should be “free of error” (Art. 10(5)) → impossible, and thus ineffective
  • Lack of remedies for affected individuals (e.g., job seekers subject to AHSs)




Technical understanding of disability


Heterogeneity

  1. People with disabilities are a heterogeneous group; they may, e.g.,
    • require more time to complete a test
    • avoid eye contact during an interview
    • have a gap in their résumé from long-term illness
  2. This contrasts with the largely homogeneous treatment of discrimination against other protected groups

Modeling disability

  • Construct a technical model of the limitations that may result from a disability
    • Are there any limitations we should consider?
    • To what extent do they actually matter for the assessment?
    • How can we accommodate them?
  • Risks of a poor technical model
    • Too vague to be relevant
    • Too rigid to allow for nuance
    • Ethical concerns with standardisation




Providing reasonable accommodation


Obstacles

  1. Automated adjustment of decisions is difficult due to heterogeneity, e.g. in depression
  2. Human intervention is costly and not necessarily better
  3. Reasonableness is hard to estimate and not clearly defined
    • How much investment should be made?
    • How much does a limitation matter during assessment?

Opportunities

  • Recommend accommodations to a human recruiter
  • Audit the AI system for patterns of bias
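The auditing opportunity can be made concrete with a simple group-fairness check: compare the AHS's selection rates between applicants who did and did not disclose a disability. A minimal sketch in Python, using hypothetical screening outcomes and the US EEOC's "four-fifths rule" as an illustrative threshold (this is one common heuristic, not a method proposed in the paper):

```python
# Sketch of a group-level bias audit for an AHS: compare selection
# rates between groups. Data and group labels here are hypothetical.

def selection_rate(decisions):
    """Fraction of positive hiring decisions (1 = advanced, 0 = rejected)."""
    return sum(decisions) / len(decisions)

def disparate_impact_ratio(protected, reference):
    """Ratio of the protected group's selection rate to the reference group's.

    The US EEOC's "four-fifths rule" treats a ratio below 0.8 as a
    common red flag for adverse impact.
    """
    return selection_rate(protected) / selection_rate(reference)

# Hypothetical AHS screening outcomes for two applicant groups.
with_disability = [1, 0, 0, 0, 1, 0, 0, 0, 0, 0]     # 20% advanced
without_disability = [1, 1, 0, 1, 0, 1, 0, 1, 0, 0]  # 50% advanced

ratio = disparate_impact_ratio(with_disability, without_disability)
print(f"Disparate impact ratio: {ratio:.2f}")  # 0.2 / 0.5 = 0.40
print("Potential adverse impact" if ratio < 0.8 else "Within four-fifths rule")
```

In practice such an audit would also need disclosed disability data (itself subject to the GDPR constraints discussed above) and per-subgroup analysis, given the heterogeneity of disability.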



Maarten Buyl

Ph.D. Candidate in AI

maarten.buyl@ugent.be

Christina Cociancig

Ph.D. Candidate in AI Ethics

chrcoc@uni-bremen.de

Cristina Frattone

Ph.D. Candidate in EU Private Law

cristina.frattone@uniroma3.it

Nele Roekens

Legal Officer at Unia

nele.roekens@unia.be
