1 of 39

Privacy

January 31, 2023

CS 195, Spring 2023 @ UC Berkeley

Lisa Yan https://eecs.link/cs195


LECTURE 03

2 of 39

Privacy and Universities

Privacy and Universities

Quick Ethics Primer

“Practical” Privacy

Policy: GDPR and CCPA

Apple and Privacy


3 of 39

Capture Higher Ed: Redefining Higher Education Marketing


4 of 39

“Student tracking, secret scores: How college admissions offices rank prospects before they apply”

Quotes from the article:

  • “Each visitor to the university site gets a cookie.”
  • “Every time that person returns to the site, Capture learns more information about them, such as their interest in athletics or the amount of time they spend on the financial aid page.”
  • “Initially, the cookies identify each visitor by the IP address, a unique code associated with a computer’s Internet connection, but Capture also offers software tools to match the cookie data with people’s real identities.”
  • “Colleges do this by sending marketing emails to thousands of prospective students, inviting them to click on a hyperlink inside the message for more information about a particular topic, according to the videos.”
  • “When a student clicks on the link, Capture learns which email address is associated with which IP address, connecting the student’s real identity to the college’s snapshot of the student’s Web browsing history.” (A hypothetical sketch of this mechanism follows the list.)
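To make the email-link trick concrete, here is a minimal, hypothetical sketch in Python of how a per-recipient link can join an email address to an existing cookie profile. None of the names, URLs, or data structures here come from Capture’s product; they are invented for illustration.

```python
# Hypothetical sketch of email click-tracking (not Capture's actual code).
# Each recipient gets a unique token embedded in the link in their email; when
# they click it, the server can associate that token (hence the email address)
# with the cookie and IP it already uses to log anonymous page visits.
import secrets
from urllib.parse import urlencode

# token -> email address, filled in when the marketing email is generated
token_to_email = {}

def make_tracking_link(email, base_url="https://admissions.example.edu/financial-aid"):
    """Create a per-recipient link with a unique token (hypothetical helper)."""
    token = secrets.token_urlsafe(8)
    token_to_email[token] = email
    return f"{base_url}?{urlencode({'t': token})}"

def record_click(token, cookie_id, ip_address, visit_log):
    """On click, join the email identity to the previously anonymous profile."""
    email = token_to_email.get(token)
    if email is not None:
        profile = visit_log.setdefault(cookie_id, {})
        profile["email"] = email        # real identity now attached
        profile["ip"] = ip_address
    return visit_log

# Tiny demonstration with an invented browsing profile
log = {"cookie-abc": {"pages": ["/financial-aid", "/athletics"]}}
link = make_tracking_link("student@example.com")
token = link.split("t=")[1]
print(record_click(token, "cookie-abc", "203.0.113.7", log))
```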


5 of 39

Poll Everywhere

Even before the coronavirus pandemic arrived, universities were increasingly tracking their students to help detect those at risk of harm or academic failure.

Is it ethical for universities to track the locations and activities of their students?


🤔

(individual)

6 of 39


7 of 39

“No Place to Hide: Colleges Track Students, Everywhere”

Universities also track students after they arrive. From a WSJ article:

  • It is now common for colleges and universities to be able to locate students through their phones, with scores of schools hiring private companies to actively monitor everywhere students go on campus.
  • These schools want to know whether students are showing up to class, studying in the library or hanging out on the quad.
  • In a pilot program for the California State University system:
    • Sacramento State University this semester gave first-year students the option to connect through their phones to a program that tracks where they go.
    • The program, provided by Degree Analytics, can be set up to flag patterns in an individual’s behavior that might increase their risk of dropping out.
    • Of the school’s 3,800 first-year students, about 500 agreed to participate.


8 of 39

On Campus Tracking (1/4): Emergencies

Universities also track students after they arrive. From “No Place to Hide: Colleges Track Students, Everywhere” (Link):

  • “Tech companies are pitching new ways to track students all the time, Prof. Rubel says. Last year he reviewed a pitch from a company that wanted to access the cameras and microphones of every student’s mobile phone in case of an emergency like a mass shooting, he says. The devices would act like a hive, collecting and feeding information to a central authority which could then direct students away from danger.”

Examples:

  • Emergencies (above)
  • Sports Events monitoring
  • Social Media monitoring
  • Academic performance monitoring


9 of 39

On Campus Tracking (2/4): Sports Events

  • “About 40 colleges are using an app called FanMaker to boost student attendance at sports events. Sensors placed in the student-seating sections detect not only when students with the app are at the game, but how long they stay.”
  • “Such information can be valuable for universities with big athletic programs. For example, if not many students attend a football game that is nationally televised, or if they leave at halftime, the school looks less attractive to prospective students. In return for filling seats at such games, the optional app offers students rewards like a ride on a Zamboni during a hockey game…”
  • “Mr. Cole adds that one Florida school used the FanMaker app to get students to come to a women’s soccer game where a highly sought-after high-school recruit was present. The coach figured if the stands were packed, the recruit would be more likely to come, Mr. Cole says.”


10 of 39

On Campus Tracking (3/4): Social Media and Personal Danger

  • “Several companies have developed complex algorithms to monitor social-media feeds in and around a school for signs that a student may be a danger to themselves or someone else. It’s hard for computers to identify intent and other nuances, so there are a lot of false positives. But if language is identified that meets certain criteria, the company sends the school a notification that includes the post itself, and it’s up to the school to interpret the message.”
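To make the false-positive problem concrete, here is a toy sketch in Python of naive keyword flagging; the phrase list and posts are invented, and real monitoring products are far more elaborate, but the underlying limitation (the rule cannot see intent) is the same.

```python
# Toy sketch of keyword-based flagging and why it yields false positives,
# in the spirit of the monitoring described above. The phrase list and the
# posts are invented for illustration; real systems use far more complex rules.
import re

ALARM_PHRASES = [r"\bshoot\b", r"\bthreat\b"]

posts = [
    "shoot me a text when you're done studying",                 # benign -> false positive
    "anyone else think the midterm was a threat to our sanity",  # benign -> false positive
    "heading to the game with friends tonight",                  # not flagged
]

def flag(post):
    """Return True if any alarm phrase matches; the rule has no notion of intent."""
    return any(re.search(p, post, re.IGNORECASE) for p in ALARM_PHRASES)

for post in posts:
    if flag(post):
        # In the systems described above, the post would be sent to the school to interpret.
        print("NOTIFY SCHOOL:", post)
```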


11 of 39

On Campus Tracking (4/4): Academic Performance

“Stephen Fugale, chief information officer at Villanova University for 17 years until his departure last September, says learning-management systems in use at VU show whether students watch videos or read chapters they were assigned, and whether they participated in online discussions.”

  • In classes I teach, I collect weekly survey data that includes self-reported information about whether students are keeping up with lectures.
    • You see this in our CS 195 course this semester :-)
  • At UC Berkeley, CS 61B also built a system that tracks various pieces of information about students so that a student’s GSI can easily identify who is struggling, based on the following (a toy sketch follows this list):
    • Late assignment submissions
    • Student scores
    • Self-reported information (weekly surveys) about being up to date with lectures
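To make the aggregation concrete, here is a toy sketch of how such signals might be combined into a simple flag for a GSI; the field names, thresholds, and sample rows are all invented and are not from the actual CS 61B system.

```python
# Toy sketch of aggregating per-student signals into an "at risk?" flag,
# in the spirit of the dashboard described above. Thresholds, field names,
# and sample data are invented for illustration.
students = [
    {"name": "Student A", "late_submissions": 0, "avg_score": 0.92, "behind_on_lectures": False},
    {"name": "Student B", "late_submissions": 3, "avg_score": 0.55, "behind_on_lectures": True},
]

def at_risk(s, max_late=2, min_score=0.6):
    """Flag a student if any of the tracked signals looks worrying."""
    return (s["late_submissions"] > max_late
            or s["avg_score"] < min_score
            or s["behind_on_lectures"])

for s in students:
    print(s["name"], "-> flag for GSI follow-up" if at_risk(s) else "-> OK")
```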


12 of 39

Returning to the same question

Even before the coronavirus pandemic arrived, universities were increasingly tracking their students to help detect those at risk of harm or academic failure.

With these examples, is it ethical for universities to track the locations and activities of their students?


🤔

(individual)

13 of 39


14 of 39

Attendance Code

https://eecs.link/cs195here

  • There will be several codes distributed throughout each lecture.
  • You must submit all codes correctly to get credit.
  • Please do not ask us to repeat the code beyond the time we give you in class to submit it.
  • You have unlimited submissions during lecture.

  • Try submitting one of the codes now.
    • Please click “Submit” despite the “You may not change your answers” prompt.
    • You should be able to view your answers after submission.

Your last submission must have all correct codes. We will not “frankenstein” submissions together.


Attendance closes at the end of lecture.

Please submit your bCourse assignment by end of class (5:00pm Pacific).

Attendance Code (1/2)

15 of 39

Quick Ethics Primer

Privacy and Universities

Quick Ethics Primer

“Practical” Privacy

Policy: GDPR and CCPA

Apple and Privacy


16 of 39

Core ethical approaches relevant to data science/computing

You need them all!

You also need policy and regulation!


Slide Courtesy: Charis Thompson

  • Consequentialist: value of what happens (JUDGE OUTCOMES)
  • Local moral worlds: cultural mores, standards, and norms (HONOR COMMUNITY)
  • Social movements (COLLECTIVE ACTION FOR SOCIAL CHANGE)
  • Deontological: intrinsic value of an act or intention (JUDGE ACTORS)
  • Social justice: countering systemic harm/discrimination (UNDO INEQUALITY)
  • Solve grand social challenges (SOCIAL RESPONSIBILITY / TECHNICAL SOLUTIONS)

17 of 39

Breakout

What ethical approaches particularly resonate with you? Why?

Breakout activity: 10 minutes

  • Introduce yourselves!
  • Pick one person to post a group sticky note to this Google Jamboard with your group’s reasoning.

https://tinyurl.com/cs195-lec03-jam-ethics


🤔🤔🤔

(breakout)

18 of 39

“Practical” Privacy

Privacy and Universities

Quick Ethics Primer

“Practical” Privacy

Policy: GDPR and CCPA

Apple and Privacy


19 of 39

Privacy matters to everyone

THREATS

  • Losing your bank information to scammers through hacks of company databases
  • Nation-states spying on protesters, journalists, and other political opponents
    • Tracking their locations

PROTECTIONS

  • Companies need protection from hackers
  • Get a burner phone
  • VPNs
  • Strong passwords and 2FA (a minimal password-generation sketch follows this list)
  • Password managers
  • Limits on what data companies can collect
  • Tor
  • Political action to change laws
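As a small, concrete companion to the “strong passwords” item above, here is a minimal sketch using Python’s standard `secrets` module; the tiny word list is a placeholder, and a real passphrase generator would draw from a large dictionary such as the EFF diceware list.

```python
# Minimal sketch: generating a strong random password or passphrase with the
# standard library. The tiny word list is only a placeholder; a real generator
# would use a large dictionary such as the EFF diceware word list.
import secrets
import string

WORDS = ["correct", "horse", "battery", "staple", "quantum", "meadow", "orbit", "violet"]

def random_passphrase(n_words=5):
    """Pick n_words uniformly at random using a cryptographically secure RNG."""
    return "-".join(secrets.choice(WORDS) for _ in range(n_words))

def random_password(length=20):
    """Alternatively, a random character password."""
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(random_passphrase())
print(random_password())
```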


20 of 39

Tracking

Long ago, the web was mostly plain text and images.

  • To the right: New Yorker cartoon (1993)
  • For reference, JavaScript came out in 1995.

The Internet is now ruled by a collection of powerful fiefdoms, each aggressively tracking its users.


21 of 39

Tracking

Screenshots on the slide:

  • Facebook’s impression of your interests
  • Google
  • Amazon product recommendations

More spooky things, like browser fingerprinting: http://panopticlick.eff.org
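To illustrate the idea behind browser fingerprinting, here is a minimal sketch: many individually weak signals are combined into one nearly unique identifier, no cookie required. The attribute names and values below are invented for illustration; real fingerprinting (as panopticlick.eff.org demonstrates) draws on dozens of signals such as canvas rendering and installed fonts.

```python
# Sketch: a browser "fingerprint" is just a stable hash over many weak signals.
# The attribute values below are made up for illustration.
import hashlib
import json

browser_attributes = {
    "user_agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 13_2) ...",
    "screen": "2560x1600x30bit",
    "timezone": "America/Los_Angeles",
    "language": "en-US",
    "installed_fonts": ["Helvetica", "Avenir", "Courier New"],
    "canvas_hash": "f3a91c",          # e.g. result of rendering a hidden canvas
}

def fingerprint(attrs):
    """Serialize the attributes deterministically, then hash them."""
    blob = json.dumps(attrs, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()[:16]

# No single attribute identifies you, but the combination is often unique,
# and none of this requires setting a cookie.
print(fingerprint(browser_attributes))
```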


22 of 39

Policy: GDPR and CCPA

Privacy and Universities

Quick Ethics Primer

“Practical” Privacy

Policy: GDPR and CCPA

Apple and Privacy


23 of 39

GDPR: The Right to Be Forgotten

“This conception of the right to be forgotten is based on the fundamental need of an individual to determine the development of his life in an autonomous way, without being perpetually or periodically stigmatized as a consequence of a specific action performed in the past, especially when these events occurred many years ago and do not have any relationship with the contemporary context.”

[Link]. From: The EU Proposal for a General Data Protection Regulation and the Roots of the ‘Right to Be Forgotten’

The “General Data Protection Regulation” in the EU was implemented in May 2018. Among many other things, it provides that people within the European Economic Area have a “Right to Erasure”.

  • Basically: you can request that a company actually erase all of its information about you (with various exceptions, e.g., free speech, legal reporting requirements, etc.). A toy sketch of what such a request might touch follows below.

https://www.gdpreu.org/gdpr-requirements/
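To give a feel for what the Right to Erasure demands of an engineering team, here is a heavily simplified, hypothetical sketch; the table names and the in-memory datastore are invented, and a real company would also have to chase copies in backups, logs, analytics pipelines, and third-party processors, which is part of what makes compliance hard.

```python
# Toy sketch of handling a GDPR erasure request. The table names and the
# in-memory "datastore" are invented; a real company would have to find and
# delete the user's data across many systems (backups, logs, analytics,
# vendors), subject to the legal exceptions mentioned above.
datastore = {
    "users":     {42: {"name": "A. Student", "email": "a@example.eu"}},
    "orders":    {7: {"user_id": 42, "item": "hoodie"}},
    "analytics": {"evt-1": {"user_id": 42, "page": "/financial-aid"}},
}

LEGAL_HOLD_TABLES = {"orders"}   # e.g. records kept for tax or legal reporting

def erase_user(user_id):
    """Delete or anonymize every record tied to user_id, honoring legal exceptions."""
    report = {}
    for table, rows in datastore.items():
        hits = [k for k, row in rows.items()
                if k == user_id or row.get("user_id") == user_id]
        for k in hits:
            if table in LEGAL_HOLD_TABLES:
                rows[k]["user_id"] = None        # anonymize, but keep the record
                report[table] = "anonymized"
            else:
                del rows[k]
                report[table] = "deleted"
    return report

print(erase_user(42))   # {'users': 'deleted', 'orders': 'anonymized', 'analytics': 'deleted'}
print(datastore)
```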


24 of 39

GDPR Individual Rights and Business Obligations


[GDPR, graphic from BankingHub]

25 of 39

Breakout 2

What technical challenges do you see in implementing GDPR requirements?

Breakout activity: 5 minutes

  • Introduce yourselves!
  • Pick one person to post group sticky note(s) to this Google Jamboard with your group’s reasoning.

https://tinyurl.com/cs195-lec03-jam-gdpr


🤔🤔🤔

(breakout)

26 of 39

California Consumer Privacy Act (CCPA)

In 2018, California passed a smaller law, the California Consumer Privacy Act (CCPA), which covers California residents.


27 of 39

Attendance Code

https://eecs.link/cs195here

  • There will be several codes distributed throughout each lecture.
  • You must submit all codes correctly to get credit.
  • Please do not ask us to repeat the code beyond the time we give you in class to submit it.
  • You have unlimited submissions during lecture.

  • Try submitting one of the codes now.
    • Please click “Submit” despite the “You may not change your answers” prompt.
    • You should be able to view your answers after submission.

Your last submission must have all correct codes. We will not “frankenstein” submissions together.


Attendance closes at the end of lecture.

Please submit your bCourse assignment by end of class (5:00pm Pacific).

Attendance Code (2/2)

28 of 39

Apple and Privacy

Privacy and Universities

Quick Ethics Primer

“Practical” Privacy

Policy: GDPR and CCPA

Apple and Privacy


29 of 39

Brief History of Encryption

Modern cryptography is strong enough that it cannot be broken by anyone.

  • If properly encrypted, communications cannot be read, and any alteration can be detected.
  • There were intense debates throughout the 1990s over whether to regulate cryptography. Ultimately, no such regulation was adopted.
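To make “properly encrypted” concrete, here is a minimal sketch using the Fernet recipe from the widely used Python `cryptography` package (authenticated symmetric encryption): without the key, the message cannot be read, and any tampering is detected. This is only an illustration of the general point, not how Apple or any phone vendor implements device encryption.

```python
# Minimal sketch of authenticated symmetric encryption with the `cryptography`
# package (pip install cryptography). This illustrates the general point above;
# it is not how any particular vendor implements device encryption.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()          # whoever holds this key can decrypt
f = Fernet(key)

ciphertext = f.encrypt(b"meet me at the quad at 5pm")
print(f.decrypt(ciphertext))         # prints b'meet me at the quad at 5pm'

# Flipping even one bit makes decryption fail outright (tampering is detected):
tampered = bytearray(ciphertext)
tampered[20] ^= 0x01
try:
    f.decrypt(bytes(tampered))
except InvalidToken:
    print("tampering detected; ciphertext rejected")
```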

In 2014, as a result of the Snowden disclosures, Apple updated iOS 8 so that Apple itself was physically unable to decrypt users’ phones [WaPo, 2014].

  • “There will come a day when it will matter a great deal to the lives of people . . . that we will be able to gain access” - James Comey, Former FBI Director (September 2014)


30 of 39

Apple vs FBI Timeline

  • 12/2/15: 14 people killed & 22 injured by Syed Farook (phone owner) & wife.
  • 2/24/16: Court orders Apple to develop a bypass of Farook's phone.
  • 2/25/16: Apple refuses and writes a letter to customers explaining why (Apple, 2016). Court hearing scheduled for 3/22/16.
    • “Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.”
  • 3/21/16: FBI requests delay in case because third party may have a way to unlock phone (CBS, 2016).
  • 3/28/16: FBI announces it has unlocked the phone (ABC, 2016) and withdraws the case.
    • Cost for hacking tool: More than $1.3 million (NYTimes, 2016).


31 of 39

Apple vs FBI Timeline

Epilogue:

  • 4/22/16: Senators introduce a bill that would force companies to comply with such requests by ensuring their products can be decrypted. Public backlash ensues, and the bill dies.
  • May 2017: During testimony, FBI Director James Comey backs Senator Dianne Feinstein’s (CA) desire for a decryption bill. Nothing has happened since.
  • September 2018: Australia has proposed a similar law [NYTimes, 2018] that is being closely watched by privacy advocates.
  • March 2021, The Verge:
    • “In the end, not much happened as a result of the effort. The FBI reportedly didn’t get any useful information from the phone, and the bureau never got to set a legal precedent about whether the government could compel companies to compromise the security of their devices.”


32 of 39

Your Alignment


33 of 39

Since then: Apple and CSAM (tw: references child abuse)

  • August 2021: Apple announces plan to scan iCloud photos for CSAM.
  • September 2021: Apple pauses rollout after backlash from privacy/security researchers, digital rights groups.

  • December 2021: Apple launches “Communication Safety” features.
    • Siri, Apple’s Spotlight search, and Safari Search warn if someone is looking at or searching for child sexual abuse material and provide resources on the spot to report the content and seek help.
    • An opt-in Messages feature analyzes image attachments users send and receive on their devices to determine whether a photo contains nudity.
    • The feature is designed so Apple never gets access to the messages, the end-to-end encryption that Messages offers is never broken, and Apple doesn’t even learn that a device has detected nudity.


CSAM: child sexual abuse material

34 of 39

What do you think?

What is the tradeoff between privacy and safety?


🤔

(individual)

35 of 39


36 of 39

[Extra] Data breaches and leaks

Privacy and Universities

Quick Ethics Primer

“Practical” Privacy

Policy: GDPR and CCPA

Apple and Privacy

[Extra] Data breaches and leaks


37 of 39

Database Leaks

In 2016, WikiLeaks released an almost completely unredacted database of information about Turkey’s ruling AKP party.

  • The data included spreadsheets containing “sensitive information of what appears to be every female voter in 79 out of 81 provinces in Turkey, including their home addresses and other private information.” (Link)
    • “The files also include whether or not these women were AKP members — right after a brutal and bloody coup attempt to overthrow the AKP.”
  • The author of the linked article verified many of these details by asking friends to confirm their correctness.


38 of 39

Data Breaches Happen All The Time

“Uber was breached to its core, purportedly by an 18-year-old” — Dan Goodin, Ars Technica, Sept 2022

  • Uber’s systems were reportedly breached by an 18-year-old.
  • It is not (yet) known everything they had access to, but they did have access to AWS, Google admin accounts, and intrusion detection systems.
  • “It remains unclear what other data the hacker had access to and whether the hacker copied or shared any of it with the world at large. Uber on Friday updated its disclosure page to say: "We have no evidence that the incident involved access to sensitive user data (like trip history)."”
  • Uber did have 2FA, but it used “soft tokens,” which are still susceptible to social engineering.


39 of 39

Deanonymization

2006: AOL releases pseudo-anonymized web search queries from 650,000 users over a 3-month period. Each user is given a unique ID.

  • People tend to search for things local to themselves: Friends’ names, addresses, etc.
  • The New York Times was able to deanonymize users: e.g., user 4417749 was actually Thelma Arnold, a 62-year-old widow, who allowed the NYT to reveal her identity. (Link) A toy sketch of this kind of linking follows below.
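To see why a pseudonymous ID is weak protection, here is a toy sketch of the kind of linking the reporters did by hand: group the log by ID and collect quasi-identifiers (place names, surnames) from each user’s own queries. The log entries and patterns below are invented for illustration, loosely echoing the published reporting.

```python
# Toy sketch: re-identifying a pseudonymized search log by collecting
# quasi-identifiers from each user's own queries. The entries below are
# invented (loosely echoing the published reporting on user 4417749).
import re
from collections import defaultdict

search_log = [
    ("4417749", "numb fingers"),
    ("4417749", "landscapers in lilburn ga"),
    ("4417749", "homes sold in shadow lake subdivision gwinnett county"),
    ("4417749", "arnold"),                      # searches for her own surname
    ("1234567", "best pizza near campus"),
]

QUASI_ID_PATTERNS = {
    "place": re.compile(r"\b(lilburn|gwinnett|shadow lake)\b"),
    "surname": re.compile(r"\b(arnold)\b"),
}

def collect_quasi_identifiers(log):
    """Group queries by pseudonymous ID and pull out identifying fragments."""
    profile = defaultdict(set)
    for user_id, query in log:
        for kind, pattern in QUASI_ID_PATTERNS.items():
            for match in pattern.findall(query):
                profile[user_id].add((kind, match))
    return dict(profile)

# A few place names plus a surname is often enough to point to a single person.
print(collect_quasi_identifiers(search_log))
```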

Other work exists to deanonymize:

  • Devices, e.g. optical sensors inside cameras. (Link)
  • Authors and message board posters based on writing style. (Link)
    • Wouldn’t be surprised if governments have very good tools for this.
  • Programmers based on coding style, even in compiled binaries! (Link)
