1 of 12

CSE 163

Fairness

Wen Qiu

🎵 Music: Ed Sheeran (sorry, not really, because of technical difficulties)

2 of 12

Group Fairness

Intent: Avoid discrimination against a particular group, so that membership in the group does not negatively impact outcomes for people in that group.

  • Does not say which groups to protect; that is a decision of policy and social norms.
  • Can be extended to notions of belonging to multiple identities (e.g., intersectionality), but we focus on protecting a single property at this time.

Usually defined in terms of mistakes the system might make


3 of 12

Definitions of Fairness

Equality of False Negatives (equal opportunity): False negative rate should be similar across groups


* Many other definitions exist; many take the form of equations over this confusion matrix. There are other notions of fairness too!

College admission example: P = Successful in college, N = Not successful in college
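
For example, a minimal sketch of checking equal opportunity with pandas, assuming hypothetical toy data with columns 'group', 'actual' (1 = successful in college), and 'predicted' (1 = admitted):

```python
import pandas as pd

# Hypothetical toy data: actual = 1 means successful in college, predicted = 1 means admitted
df = pd.DataFrame({
    'group':     ['A', 'A', 'A', 'B', 'B', 'B'],
    'actual':    [1,   1,   0,   1,   1,   0],
    'predicted': [1,   0,   0,   0,   0,   1],
})

def false_negative_rate(g):
    """FNR = FN / (FN + TP): fraction of truly successful applicants the model rejects."""
    fn = ((g['actual'] == 1) & (g['predicted'] == 0)).sum()
    tp = ((g['actual'] == 1) & (g['predicted'] == 1)).sum()
    return fn / (fn + tp)

# Equal opportunity asks these per-group rates to be similar
print(df.groupby('group').apply(false_negative_rate))
```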

4 of 12

Definitions of Fairness

Equality of False Positives (predictive equality): False positive rate should be similar across groups


* Many other definitions exist; many take the form of equations over this confusion matrix. There are other notions of fairness too!

College admission example: P = Successful in college, N = Not successful in college
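
Predictive equality can be checked the same way; this short sketch continues the one above, reusing the same hypothetical df:

```python
def false_positive_rate(g):
    """FPR = FP / (FP + TN): fraction of truly unsuccessful applicants the model admits anyway."""
    fp = ((g['actual'] == 0) & (g['predicted'] == 1)).sum()
    tn = ((g['actual'] == 0) & (g['predicted'] == 0)).sum()
    return fp / (fp + tn)

# Predictive equality asks these per-group rates to be similar
print(df.groupby('group').apply(false_positive_rate))
```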

5 of 12

Human Choice

There is no one “right” definition for fairness. They are all valid and are simply statements of what you believe fairness means in your system.

It’s possible for definitions of fairness to contradict each other, so it’s important that you pick the one that reflects your values.

Emphasizes the role of people in the process of fixing bias in ML algorithms.


6 of 12

Tradeoff Between Fairness and Accuracy

We can’t get fairness for free; generally, finding a fairer model will yield one that is less accurate.

  • Intuition: We saw lots of examples where bias was a byproduct of an “accurate” model since that model was not trained with fairness in mind.

Can quantify this tradeoff with Pareto Frontiers


7 of 12

Pareto Frontiers
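
A minimal sketch of identifying a Pareto frontier, assuming hypothetical (accuracy, fairness) scores for a set of candidate models where higher is better on both axes; a model is on the frontier if no other model dominates it:

```python
# Hypothetical candidate models scored on accuracy and a fairness measure (higher is better)
models = {
    'model_1': (0.92, 0.60),
    'model_2': (0.88, 0.75),
    'model_3': (0.85, 0.72),   # dominated by model_2 (worse on both axes)
    'model_4': (0.80, 0.90),
}

def dominates(a, b):
    """True if point a is at least as good as b on both axes and strictly better on one."""
    return a[0] >= b[0] and a[1] >= b[1] and a != b

# A model is on the Pareto frontier if no other model dominates it
frontier = [name for name, score in models.items()
            if not any(dominates(other, score) for other in models.values())]
print(frontier)  # ['model_1', 'model_2', 'model_4']
```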


8 of 12

Fairness Worldviews

Example: College admissions

We want to measure abstract qualities about a person (e.g., intelligence or grit), but real-life measurements may or may not capture those qualities well.

We only have access to the Observed Space and hope it is a good representation of the Construct Space.


9 of 12

Worldview 1: WYSIWYG

Worldview 1: What You See is What You Get (WYSIWYG)

  • Assumes the Observed Space is a good representation of the Construct Space.

Under this worldview, we can guarantee individual fairness. Individual fairness says that if two people are close in the Construct Space, they should receive similar outcomes.

  • Easy to verify under WYSIWYG, since you can use the Observed Space as a good stand-in for the Construct Space (a minimal check is sketched below).
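
A minimal sketch of what a Lipschitz-style individual fairness check might look like under WYSIWYG, using distance between Observed Space features as the stand-in for closeness in the Construct Space; the features (GPA, SAT), scores, and tolerance L are all hypothetical, chosen only for illustration:

```python
import numpy as np

# Hypothetical applicants: Observed Space features (GPA, SAT) and the model's admission scores
features = np.array([[3.8, 1450.0], [3.7, 1440.0], [2.9, 1100.0]])
scores   = np.array([0.90, 0.55, 0.40])

# Individual fairness check: people who are close in feature space should get similar scores.
L = 0.01  # hypothetical tolerance relating feature distance to allowed score distance
for i in range(len(features)):
    for j in range(i + 1, len(features)):
        feature_dist = np.linalg.norm(features[i] - features[j])
        score_dist = abs(scores[i] - scores[j])
        if score_dist > L * feature_dist:
            print(f"Applicants {i} and {j} may be treated unfairly as individuals")
```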


10 of 12

Worldview 2: Structural Bias + WAE

Worldview 2: Structural Bias and We’re All Equal (WAE)

  • Assumes structural or social systems make groups that are similar in the Construct Space look more different in the Observed Space.
  • Example: SAT scores for one group might be artificially high due to a better ability to afford SAT prep. Factors outside the qualities of interest now affect our measurements, so we assume any observed differences between groups come from systematic factors rather than inherent ones (since We’re All Equal).

The goal in this worldview is to ensure non-discrimination, so that someone isn’t negatively impacted simply by being a member of a particular group.

  • This is the implicit assumption we were making when discussing notions of group fairness earlier.


11 of 12

Contrasting Worldviews

Unfortunately, there is no way to tell which worldview is right for a given problem (we have no access to the Construct Space). The worldview is a statement of beliefs.

WYSIWYG can promise individual fairness, but methods of non-discrimination will be individually unfair under this worldview.

Structural Bias + WAE can promise non-discrimination. Methods of individual fairness will lead to discrimination (since using biased data as our proxy for closeness will lead to a skewed notion of individual fairness).


12 of 12

Group Work:

Best Practices

When you first start working with this group:

  • Introduce yourself!
  • If possible, angle one of your screens so that everyone can discuss together

Tips:

  • Start by making sure everyone agrees to work on the same problem
  • Make sure everyone gets a chance to contribute!
  • Ask if everyone agrees and periodically ask each other questions!
  • Call TAs over for help if you need any!
