1 of 15

CSE 163

Fairness

Suh Young Choi

2 of 15

Announcements

Project Proposals due tonight, 11:59 pm

  • 12:15 am buffer window on Gradescope for technical issues
  • No late submissions or resubmissions for this

Checkpoint 6 due tonight

Class asynchronous today and Wednesday; back in the classroom on Friday!

  • Lessons 21 and 22 have no due dates; these are free lesson completions for everyone


3 of 15

Checking in with the Check-ins

Resubmissions

  • Two more cycles guaranteed
    • 8/2 – 8/8
    • 8/9 – 8/15
  • If 60% of the class fills out the course eval by 8/15, then there will be a bonus resubmission period from 8/16 to 8/18
    • You can resubmit any ONE take-home assessment and any ONE learning reflection!

Projects and Pacing

  • TA mentors can help you figure out what is reasonable to complete
  • It’s OK if your work doesn’t meet your expectations—show us how far you got!


4 of 15

This Time

  • Defining Fairness
  • Quantifying error rates
  • Worldviews

Last Time

  • Machine learning with images
  • Ethics of ML


5 of 15

Group Fairness

Intent: Avoid discrimination against a particular group, so that membership in the group does not negatively impact outcomes for people in that group.

  • Does not say which groups to protect; that is a decision of policy and social norms.
  • Can be extended to notions of belonging to multiple identities (e.g., intersectionality), but we focus on protecting a single property for now

Usually defined in terms of mistakes the system might make


6 of 15

Definitions of Fairness

Equality of False Negatives (equal opportunity): False negative rate should be similar across groups


* Many others exist, many are in the form of equations on this confusion matrix! There are other notions of fairness too!

College admission example: P = Successful in college, N = Not successful in college

7 of 15

Definitions of Fairness

Equality of False Positives (predictive equality): False positive rate should be similar across groups


* Many others exist, many are in the form of equations on this confusion matrix! There are other notions of fairness too!

College admission example: P = Successful in college, N = Not successful in college
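The two definitions above can be checked directly from a model's predictions. Below is a minimal sketch with invented toy data (the labels, groups, and numbers are all hypothetical, not from the slides) that computes the false negative rate and false positive rate separately for each group:

```python
# Hypothetical sketch: checking equal opportunity (similar false negative
# rates) and predictive equality (similar false positive rates) across
# two groups in the college-admission example.

def error_rates(y_true, y_pred):
    """Return (false_negative_rate, false_positive_rate) for one group."""
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    pos = sum(y_true)            # actual positives (successful in college)
    neg = len(y_true) - pos      # actual negatives (not successful)
    return fn / pos, fp / neg

# Toy labels: 1 = successful in college (P), 0 = not successful (N)
group_a_true, group_a_pred = [1, 1, 0, 0, 1], [1, 0, 0, 1, 1]
group_b_true, group_b_pred = [1, 0, 0, 1, 1], [0, 0, 1, 1, 1]

fnr_a, fpr_a = error_rates(group_a_true, group_a_pred)
fnr_b, fpr_b = error_rates(group_b_true, group_b_pred)

print(f"Group A: FNR={fnr_a:.2f}, FPR={fpr_a:.2f}")
print(f"Group B: FNR={fnr_b:.2f}, FPR={fpr_b:.2f}")
```

Equal opportunity asks whether the two FNRs are similar; predictive equality asks the same of the two FPRs.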

8 of 15

Human Choice

There is no one “right” definition for fairness. They are all valid and are simply statements of what you believe fairness means in your system.

It’s possible for definitions of fairness to contradict each other, so it’s important that you pick the one that reflects your values.

Emphasizes the role of people in the process of fixing bias in ML algorithms.


9 of 15

Tradeoff Between Fairness and Accuracy

We can’t get fairness for free; generally, making a model more fair will make it less accurate.

  • Intuition: We saw lots of examples where bias was a byproduct of an “accurate” model since that model was not trained with fairness in mind.

Can quantify this tradeoff with Pareto Frontiers
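A Pareto frontier over candidate models can be computed with a simple dominance check. The sketch below uses invented (accuracy, fairness) scores where both axes are higher-is-better; a model is on the frontier if no other model beats it on both:

```python
# Hypothetical sketch: given candidate models scored on accuracy and a
# fairness measure (both higher-is-better), keep only the Pareto-optimal
# ones -- those not dominated by another model on both axes.

def pareto_frontier(models):
    """models: list of (accuracy, fairness) pairs. Returns the frontier."""
    frontier = []
    for m in models:
        dominated = any(
            other[0] >= m[0] and other[1] >= m[1] and other != m
            for other in models
        )
        if not dominated:
            frontier.append(m)
    return frontier

candidates = [(0.90, 0.40), (0.85, 0.60), (0.80, 0.55), (0.70, 0.80)]
print(pareto_frontier(candidates))  # → [(0.9, 0.4), (0.85, 0.6), (0.7, 0.8)]
```

Here (0.80, 0.55) is dropped because (0.85, 0.60) is both more accurate and more fair; the remaining models each trade some accuracy for some fairness.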


10 of 15

Pareto Frontiers


11 of 15

Fairness Worldviews

Example: College admissions

We want to measure abstract qualities about a person (e.g., intelligence or grit), but real life measurements may or may not measure abstract qualities well.

We only have access to the Observed Space and hope it is a good representation of the Construct Space.


12 of 15

Worldview 1: WYSIWYG

Worldview 1: What You See is What You Get (WYSIWYG)

  • Assumes the Observed Space is a good representation of the Construct Space.

Under this worldview, we can guarantee individual fairness. Individual fairness says that if two people are close in the Construct Space, they should receive similar outcomes.

  • Easy to verify under WYSIWYG since you can use the Observed Space as a good representation of the Construct Space.
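Under WYSIWYG, an individual-fairness check reduces to comparing observed features directly. The sketch below is entirely hypothetical (the function, thresholds, and data are invented for illustration): it flags pairs of people who are close in observed feature space but receive very different outcomes.

```python
# Hypothetical sketch of individual fairness under WYSIWYG: the Observed
# Space (here, a single test score) stands in for the Construct Space, so
# people with similar scores should receive similar predicted outcomes.

def fairness_violations(features, outcomes, dist_eps=1.0, out_eps=0.1):
    """Return index pairs that are close in features but far apart in outcome."""
    violations = []
    for i in range(len(features)):
        for j in range(i + 1, len(features)):
            feature_dist = abs(features[i] - features[j])
            outcome_gap = abs(outcomes[i] - outcomes[j])
            if feature_dist <= dist_eps and outcome_gap > out_eps:
                violations.append((i, j))
    return violations

# Toy 1-D observed features and predicted outcomes.
scores = [3.0, 3.2, 7.0]
preds = [0.80, 0.30, 0.90]
print(fairness_violations(scores, preds))  # → [(0, 1)]
```

People 0 and 1 have nearly identical scores but very different predictions, so this pair is flagged; person 2 is far from both in feature space, so their different outcome raises no concern.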


13 of 15

Worldview 2: Structural Bias + WAE

Worldview 2: Structural Bias and We’re All Equal (WAE)

  • Assumes structural or social factors make groups that look similar in the Construct Space look more different in the Observed Space.
  • Example: SAT scores for one group might be artificially high due to a better ability to afford SAT prep. Factors outside the qualities of interest now affect our measurements, so under WAE we assume any observed differences between groups come from systematic factors rather than inherent ones.

Goal in this worldview is to ensure non-discrimination so that someone isn’t negatively impacted by simply being a member of a particular group.

  • This is the implicit assumption we were making when discussing notions of group fairness earlier
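One common way to operationalize non-discrimination under this worldview is statistical parity: comparing the rate of positive outcomes across groups. The sketch below uses invented toy data, and statistical parity is just one possible non-discrimination criterion, not the only one:

```python
# Hypothetical sketch of a non-discrimination check under Structural Bias +
# WAE: compare positive-outcome rates across groups (statistical parity),
# since observed group differences are assumed to be systematic, not inherent.

def positive_rate(decisions):
    """Fraction of people in a group who received the positive outcome."""
    return sum(decisions) / len(decisions)

group_a = [1, 1, 0, 1]   # admitted = 1, rejected = 0
group_b = [1, 0, 0, 0]

gap = abs(positive_rate(group_a) - positive_rate(group_b))
print(f"parity gap = {gap:.2f}")  # 0.75 - 0.25 = 0.50
```

A large gap suggests group membership is affecting outcomes, which is exactly what this worldview seeks to prevent.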


14 of 15

Contrasting Worldviews

Unfortunately there is no way to tell which worldview is right for a given problem (no access to Construct Space). The worldview is a statement of beliefs.

WYSIWYG can promise individual fairness but methods of non-discrimination will be individually unfair under this worldview.

Structural Bias + WAE can promise non-discrimination. Methods of individual fairness will lead to discrimination, since using biased data as our proxy for closeness leads to a skewed notion of individual fairness.


15 of 15

Before Next Time

  • Read through Lesson 22
  • CP6 due tonight
  • Proposals due tonight
  • Resub cycle closes tomorrow (last day to resubmit HW3)

Next Time

  • Ethics, cont’d
  • Privacy
