CSE 163
Fairness
Wen Qiu
🎵Music: Ed Sheeran (sorry, not really, because of technical difficulties)
Group Fairness
Intent: Avoid discrimination against a particular group, so that membership in the group does not negatively impact outcomes for people in that group.
Usually defined in terms of mistakes the system might make
Definitions of Fairness
Equality of False Negatives (equal opportunity): False negative rate should be similar across groups
* Many others exist, many are in the form of equations on this confusion matrix! There are other notions of fairness too!
College admission example: P = Successful in college, N = Not successful in college
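The equal-opportunity definition above can be sketched as code. This is a minimal illustration, not the course's implementation: the DataFrame, its column names, and the toy labels are all invented for the example.

```python
# Hypothetical sketch: checking "equality of false negatives" by
# computing the false negative rate (FNR) for each group.
# All column names and data below are made up for illustration.
import pandas as pd

df = pd.DataFrame({
    "group":     ["A", "A", "A", "B", "B", "B"],
    "actual":    [1, 1, 0, 1, 1, 0],   # 1 = successful in college (P)
    "predicted": [1, 0, 0, 1, 1, 0],   # model's admission decision
})

def false_negative_rate(sub):
    # FN / (FN + TP): of the truly successful, how many were rejected?
    positives = sub[sub["actual"] == 1]
    return (positives["predicted"] == 0).mean()

fnr_by_group = df.groupby("group")[["actual", "predicted"]].apply(false_negative_rate)
print(fnr_by_group)  # similar values across groups -> closer to equal opportunity
```

A large gap between the per-group rates (here, group A's FNR is much higher than group B's) would signal a violation of equal opportunity.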
Definitions of Fairness
Equality of False Positives (predictive equality): False positive rate should be similar across groups
Human Choice
There is no one “right” definition for fairness. They are all valid and are simply statements of what you believe fairness means in your system.
It’s possible for definitions of fairness to contradict each other, so it’s important that you pick the one that reflects your values.
Emphasizes the role of people in the process of fixing bias in ML algorithms.
Tradeoff Between Fairness and Accuracy
We can’t get fairness for free; generally, finding a fairer model will yield one that is less accurate.
Can quantify this tradeoff with Pareto Frontiers
Pareto Frontiers
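The Pareto-frontier idea can be sketched directly: among candidate models scored on (accuracy, fairness), keep only those not dominated by another candidate. The candidate scores below are made up, and both metrics are treated as higher-is-better.

```python
# Hypothetical sketch of a Pareto frontier over candidate models.
# Each tuple is (accuracy, fairness); all values are invented.
candidates = [
    (0.90, 0.40),  # very accurate, not very fair
    (0.85, 0.60),
    (0.80, 0.55),  # dominated by (0.85, 0.60): worse on both metrics
    (0.75, 0.80),
    (0.60, 0.85),  # very fair, less accurate
]

def pareto_frontier(points):
    """Keep only points not dominated by any other point
    (i.e., no other point is at least as good on both metrics)."""
    frontier = []
    for p in points:
        dominated = any(
            q[0] >= p[0] and q[1] >= p[1] and q != p
            for q in points
        )
        if not dominated:
            frontier.append(p)
    return frontier

frontier = pareto_frontier(candidates)
```

Every point on the frontier represents a different accuracy/fairness tradeoff; choosing among them is a human value judgment, not a computation.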
Fairness Worldviews
Example: College admissions
We want to measure abstract qualities about a person (e.g., intelligence or grit), but real life measurements may or may not measure abstract qualities well.
Only have access to Observed Space and we hope it’s a good representation of the Construct Space.
Worldview 1: WYSIWYG
Worldview 1: What You See is What You Get (WYSIWYG)
Under this worldview, we can guarantee individual fairness. Individual fairness says that if two people are close in the Construct Space, they should receive similar outcomes.
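The individual-fairness condition can be written as a pairwise check: the difference in outcomes should be bounded by the distance between the two people. This is a toy sketch; the features, scores, distance metric, and bound `L` are all invented, and Euclidean distance over observed features stands in for the Construct Space distance we can't actually measure.

```python
# Hypothetical sketch of individual fairness: people who are close
# should receive similar scores. Everything here is invented.
import math

def distance(x, y):
    # Euclidean distance over observed features (a stand-in for
    # the unmeasurable Construct Space distance).
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def is_individually_fair(people, scores, L=1.0):
    """Check |score_i - score_j| <= L * d(i, j) for every pair."""
    for i in range(len(people)):
        for j in range(i + 1, len(people)):
            gap = abs(scores[i] - scores[j])
            if gap > L * distance(people[i], people[j]):
                return False
    return True

# Features scaled to [0, 1]; e.g., (normalized GPA, normalized SAT).
people = [(0.95, 0.93), (0.93, 0.92), (0.40, 0.35)]
fair_scores   = [0.90, 0.88, 0.30]  # similar people, similar scores
unfair_scores = [0.90, 0.20, 0.30]  # near-identical people, very different scores
```

Under WYSIWYG the observed features are trusted to reflect the Construct Space, which is exactly what licenses using a distance like this.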
Worldview 2: Structural Bias + WAE
Worldview 2: Structural Bias and We’re All Equal (WAE)
The goal in this worldview is to ensure non-discrimination, so that someone isn’t negatively impacted simply by being a member of a particular group.
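One common way to operationalize non-discrimination is a statistical-parity check: the rate of positive outcomes should not depend on group membership. A minimal sketch, with invented decisions and an invented tolerance:

```python
# Hypothetical sketch of a non-discrimination (statistical parity)
# check: compare the positive-outcome rate across groups.
# The decisions and the 0.1 tolerance are made up for illustration.
from collections import defaultdict

decisions = [
    ("A", 1), ("A", 0), ("A", 1), ("A", 1),
    ("B", 1), ("B", 0), ("B", 1), ("B", 0),
]

def positive_rate_by_group(decisions):
    totals, positives = defaultdict(int), defaultdict(int)
    for group, outcome in decisions:
        totals[group] += 1
        positives[group] += outcome
    return {g: positives[g] / totals[g] for g in totals}

rates = positive_rate_by_group(decisions)
gap = max(rates.values()) - min(rates.values())
is_nondiscriminatory = gap <= 0.1  # flag a parity gap beyond some tolerance
```

Note the contrast with individual fairness: this check looks only at group-level rates and says nothing about how any two similar individuals are treated.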
Contrasting Worldviews
Unfortunately there is no way to tell which worldview is right for a given problem (no access to Construct Space). The worldview is a statement of beliefs.
WYSIWYG can promise individual fairness but methods of non-discrimination will be individually unfair under this worldview.
Structural Bias + WAE can promise non-discrimination. Methods of individual fairness will lead to discrimination, since using biased data as our proxy for closeness yields a skewed notion of individual fairness.
Group Work:
Best Practices
When you first start working with this group:
Tips: