Computer Vision, Society, and Ethics
CS5670: Computer Vision
Closed circuit TV monitoring at the Central Police Control Station, Munich, 1973
(from the Wikipedia article on the Panopticon)
Announcements
Additional Resources
Advances in computer vision
Today
Questions to ask about a specific task
More questions
Slide credit: Bharath Hariharan
Bias in computer vision and beyond
Shirley cards
How Kodak's Shirley Cards Set Photography's Skin-Tone Standard
The Racial Bias Built Into Photography
https://www.nytimes.com/2019/04/25/lens/sarah-lewis-racial-bias-photography.html
Kodak’s Multiracial Shirley Card, North America. 1995.
Example Kodak Shirley Card, 1950s and beyond
Face recognition
Google Photos automatic face clustering and recognition
Face analysis
Gender Shades – Evaluation of bias in Gender Classification
Joy Buolamwini and Timnit Gebru. Gender shades: Intersectional accuracy disparities in commercial gender classification. Conference on Fairness, Accountability and Transparency. 2018.
Images from the Pilot Parliaments Benchmark
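The core of the Gender Shades methodology is disaggregated evaluation: instead of reporting one overall accuracy, accuracy is computed separately for each intersectional subgroup (e.g., darker-skinned women vs. lighter-skinned men). A minimal sketch of that analysis, with purely illustrative labels and predictions:

```python
# Disaggregated (per-subgroup) accuracy, as in the Gender Shades audit.
# All data below is toy/illustrative, not from the actual benchmark.
from collections import defaultdict

def accuracy_by_subgroup(y_true, y_pred, subgroups):
    """Return classification accuracy computed separately per subgroup."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for t, p, g in zip(y_true, y_pred, subgroups):
        total[g] += 1
        correct[g] += int(t == p)
    return {g: correct[g] / total[g] for g in total}

# A classifier that looks fine in aggregate but fails on one subgroup.
y_true    = ["F", "F", "M", "M", "F", "M", "F", "M"]
y_pred    = ["F", "M", "M", "M", "F", "M", "M", "M"]
subgroups = ["darker_f", "darker_f", "darker_m", "darker_m",
             "lighter_f", "lighter_m", "darker_f", "lighter_m"]

print(accuracy_by_subgroup(y_true, y_pred, subgroups))
```

Aggregate accuracy here is 6/8 = 75%, but the per-subgroup breakdown reveals that all errors fall on one group, which is exactly the disparity the paper documents in commercial systems.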
Case study – upsampling faces
PULSE: Self-Supervised Photo Upsampling via Latent Space Exploration of Generative Models
Sachit Menon, Alexandru Damian, Shijia Hu, Nikhil Ravi, and Cynthia Rudin
“We have noticed a lot of concern that PULSE will be used to identify individuals whose faces have been blurred out. We want to emphasize that this is impossible - PULSE makes imaginary faces of people who do not exist, which should not be confused for real people. It will not help identify or reconstruct the original image.
We also want to address concerns of bias in PULSE. We have now included a new section in the paper and an accompanying model card directly addressing this bias.”
https://github.com/tg-bomze/Face-Depixelizer, accessed May 4, 2021
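PULSE does not "deblur" an image; it searches the latent space of a generative model for a high-resolution face whose downsampled version matches the low-resolution input. A toy sketch of that objective, using a fixed random linear map as a stand-in "generator" (everything here is illustrative, not the actual PULSE model or code):

```python
# Toy sketch of the PULSE objective: find a latent z such that
# downsample(G(z)) matches the observed low-res input y_low.
# G is a stand-in linear "generator"; PULSE uses StyleGAN.
import numpy as np

rng = np.random.default_rng(0)

hi, lo, latent = 16, 4, 8
W = rng.normal(size=(hi, latent))        # toy generator: G(z) = W @ z
D = np.zeros((lo, hi))                   # 4x average-pooling downsampler
for i in range(lo):
    D[i, 4 * i:4 * (i + 1)] = 0.25

z_true = rng.normal(size=latent)
y_low = D @ W @ z_true                   # the observed low-res "image"

# Gradient descent on z to minimize ||D G(z) - y_low||^2
A = D @ W
z = np.zeros(latent)
for _ in range(2000):
    grad = 2 * A.T @ (A @ z - y_low)
    z -= 0.05 * grad

print(np.linalg.norm(D @ W @ z - y_low))  # low-res match is near zero
```

Note that the recovered `z` need not equal `z_true`: the system is underdetermined (4 constraints, 8 latent dimensions), so many distinct "high-res faces" downsample to the same input. This is precisely why the authors stress that PULSE outputs are imaginary faces, not reconstructions of the original person, and also why the upsampled faces can inherit whatever biases the generator's training data contains.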
Case study – classifying sexual orientation
“We show that faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain… Given a single facial image, a classifier could correctly distinguish between gay and heterosexual men in 81% of cases, and in 74% of cases for women. … Consistent with the prenatal hormone theory of sexual orientation, gay men and women tended to have gender-atypical facial morphology, expression, and grooming styles … our findings expose a threat to the privacy and safety of gay men and women.”
Wang & Kosinski 2017
More questions
Slide credit: Bharath Hariharan
Answers
Slide credit: Bharath Hariharan
Do algorithms reveal sexual orientation or just expose our stereotypes?
Blaise Agüera y Arcas, Alexander Todorov and Margaret Mitchell
Datasets – Potential Issues
Case study – ImageNet
Case study – Microsoft Celeb
Case study – LAION-5B
Some “sunsetted” datasets
Datasheets for Datasets
“The ML community currently has no standardized process for documenting datasets, which can lead to severe consequences in high-stakes domains. To address this gap, we propose datasheets for datasets. In the electronics industry, every component, no matter how simple or complex, is accompanied with a datasheet that describes its operating characteristics, test results, recommended uses, and other information. By analogy, we propose that every dataset be accompanied with a datasheet that documents its motivation, composition, collection process, recommended uses, and so on.”
Timnit Gebru, Jamie Morgenstern, Briana Vecchione, Jennifer Wortman Vaughan, Hanna Wallach, Hal Daumé III, Kate Crawford. 2018
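One way to operationalize the datasheet proposal is to make it machine-checkable: represent the datasheet as structured metadata and refuse to release a dataset until every required section is documented. A minimal sketch (this is an illustrative format, not an official one from the paper):

```python
# Illustrative machine-readable datasheet with the section headings
# proposed by Gebru et al., plus a completeness check before release.
REQUIRED_SECTIONS = [
    "motivation", "composition", "collection_process",
    "preprocessing", "recommended_uses", "distribution", "maintenance",
]

def validate_datasheet(sheet: dict) -> list:
    """Return the required sections that are missing or empty."""
    return [s for s in REQUIRED_SECTIONS if not sheet.get(s)]

# Hypothetical, partially filled datasheet (contents are made up).
datasheet = {
    "motivation": "Benchmark face attributes across demographic groups.",
    "composition": "1,270 images of parliamentarians from 6 countries.",
    "collection_process": "Collected from official government websites.",
    "recommended_uses": "Auditing classifiers; not identification.",
}

print(validate_datasheet(datasheet))  # sections still to be documented
```

Here the check flags `preprocessing`, `distribution`, and `maintenance` as undocumented, mirroring the electronics-industry analogy in the quote: no component ships without a complete datasheet.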
DeepFakes
Text-to-image models
Example Text-to-image prompt: “Wizard with sword and a glowing orb of magic fire fights a fierce dragon Greg Rutkowski,”
“Dragon Cave”
GREG RUTKOWSKI
Generated images of lawyers
Generated images of flight attendants
New York Times, February 22, 2024
Some tools
Questions?