1 of 78

Designing for Privacy in an Increasingly Public World

A New HOPE | 22 July 2022 | Robert Stribley

2 of 78

What do we mean by “designing for privacy”?

3 of 78

  • Companies and organizations have to consider the privacy of their users’ data, their content, even their browsing behavior, for those users’ benefit and safety
  • But they also do it out of their own personal and financial self-interest
  • It’s increasingly important that they consider the privacy and security issues affecting their digital experiences
  • So how do we design these digital experiences — apps, websites, and so on — to ensure people’s privacy?

Background

4 of 78

Different Concepts

    • Privacy: Your ability to control your personal information and how it’s used

    • Security: How your personal information is protected by those holding on to it

These concepts often overlap, so we’ll refer to both

Our focus: How we can ensure people’s privacy is maintained as we design experiences for them

Purpose

5 of 78

Why Privacy?

Image by Jack Ferrentino for NPR

6 of 78

“Arguing that you don't care about the right to privacy because you have nothing to hide is no different than saying you don't care about free speech because you have nothing to say.”

— Edward Snowden, former CIA employee, infamous NSA leaker

Why Privacy?

7 of 78

  • Even if we’re not concerned with a particular privacy issue, we’re not designing for ourselves

  • If we’re designing with empathy, we’ll consider the needs of people not like ourselves—people with different backgrounds and experiences

  • That means researching privacy issues, but also interviewing or talking to people with diverse backgrounds and lived experiences

Why Privacy?

8 of 78

Example:

DayOne, a non-profit that provides services to young people in abusive dating relationships.

Online privacy is a very real issue for these clients, who may worry about partners tracking their online activity or even stalking them in real life. That’s something we’d consider when designing a site for them.

Why Privacy?

Screenshot from DayOne’s website

9 of 78

Similarly, LGBTQ youth need to feel their privacy is secure when reaching out for help online.

In this sense, privacy issues are very often also diversity issues.

Privacy is a key consideration for inclusive design.

Why Privacy?

Screenshot from The Audre Lorde Project’s Facebook page

10 of 78

Privacy Issues

Deepfake of Barack Obama – Photo from Washington Post

11 of 78

Wall Street Journal, 2019

Security experts believe quantum computers will break through our existing encryption technology within a decade.

Experts must determine new ways to protect our data—and quickly.

Imagine if all your passwords suddenly became useless to protect your identity and your personal information.

Data Security

12 of 78

In April 2021, we learned that Facebook, the largest, most popular social media platform on the planet, had been hacked.

533 million users’ phone numbers and personal data were leaked online.

Data for half a billion people.

Data Security

13 of 78

  • Fraud & identity theft on the rise during the pandemic
  • FTC: 1.4 million reports of identity theft in 2020 — double the 2019 total
  • Identity thieves targeted government funds earmarked to help people hit financially by the pandemic
  • Leaks of personal data can be catastrophic to people’s lives
  • Cleaning up the mess created by identity theft can take years

Fraud & Identity Theft

Photo by Kyle Glenn

14 of 78

Clearview.ai, a facial recognition platform, offers services to law enforcement.

They downloaded over 3 billion photos of people from the Internet and social media and used them to build facial recognition models for millions of people without their permission.

Facial Recognition

Illustration: Elena Lacy for Wired


15 of 78

Stores such as Albertsons, Rite-Aid, Macy’s, and ACE Hardware are using facial recognition programs to identify customers.

Some also use apps to track customers around their stores to present them with ads online later.

Facial Recognition

16 of 78

During the war in Ukraine, we’ve seen deepfake videos emerge on both sides featuring Vladimir Putin and Volodymyr Zelensky.

These deepfakes enable the spread of misinformation and can even trigger dangerous actions.

Increasingly, they can create havoc from a privacy perspective, too.

Deep Fakes

17 of 78

A donation site for Donald Trump deployed “dark patterns” to trick supporters into agreeing to recurring donations, earning the campaign a huge spike in contributions.

Dark Patterns

18 of 78

    • Designers rolled out different iterations of this feature
    • Each came with increasingly confusing language, fine print, bold text, all-caps, and a pre-selected check box
    • They referred to the feature internally as a “money bomb”
    • Donations grew astronomically — but so did fraud complaints to banks and credit card companies from angry supporters

Dark Patterns

19 of 78

In early 2021, delivery drivers were required to sign consent forms, which allowed Amazon to collect their biometric data* and to use AI cameras to monitor their location, their movement, and their driving patterns.

At least one driver quit over this form of “AI surveillance.”

*Information about your face, your expressions, body movements, etc.

Loss of Privacy

20 of 78

Data Sharing

Just weeks ago, critics began voicing concerns that, if Roe v. Wade were overturned, personal data could be misused to target pregnant individuals who may be considering or seeking an abortion.

21 of 78

Data Sharing

Unfortunately, this isn’t really a new issue.

22 of 78

Data Sharing

In a post-Roe USA, we’re already seeing how our data can be used against us.

23 of 78

  • Demand for personalized content, which benefits from personal data, seems higher than ever
  • People say they want personalized ads, so you’d think they enjoy sharing their data: let a company follow you around the internet and, in theory, you get better content and better advertising
  • But a 2019 survey by network security company RSA found only 17% of respondents said it was ethical to track their online activity to personalize ads
  • Earlier, Pew Research found 91% of adults believe consumers have lost control over how their personal information is collected and used by companies

Data Sharing

24 of 78

Data Sharing

Sign of the times:

Apple rolled out a new iPhone privacy feature called “App Tracking Transparency,” an anti-tracking shield that prevents apps from snooping on you and shadowing you across the internet.

Apps have to ask first.

Hugely popular in the US: only about 20% of iOS users have allowed apps to track them so far.

25 of 78

Data Sharing

The oncoming “cookie apocalypse”

  • Google plans to update its Chrome browser by 2023

  • The update will prevent companies from tracking your wanderings around the internet via third-party cookies

26 of 78

  • Some companies aren’t happy with these developments
  • Much of this is about competition among big companies like Apple, Google and Facebook
  • But it’s also a result of consumers’ increasing concerns about the privacy and security of their data and their activity online

Data Sharing

27 of 78

We just discussed a few common privacy and security issues, but in 2019, Smashing Magazine identified 24 top user privacy concerns. Those included many of the things we just discussed but also …

  • Convoluted privacy policy changes
  • Unwanted notifications and marketing emails 
  • Lack of proper control of your personal data 
  • Making it difficult to delete personal details or cancel your account 
  • Social profiling by potential employers
  • Hidden fees
  • Automatically importing the contact information of your friends
  • Hacked email and social media accounts
  • And so on

Top Privacy Concerns

These are real issues that matter to real people.

28 of 78

Impact of Regulations

29 of 78

GDPR stands for …

The General Data Protection Regulation

Finalized in 2016; came into effect in 2018

Regulates how apps and sites can gather, transfer, and process personal data when operating within the European Union

Also governs what happens to that data when it’s transferred outside of the EU

Impact of Regulations

Remember a while back when you suddenly got a gazillion emails from companies telling you they had updated their privacy policies?

That was a result of the GDPR.

30 of 78

Some things GDPR requires (sketched in code after this list) …

  • Ask people to opt in to sharing their data
  • Communicate to people in the moment, when you’re collecting their personal data
  • Be transparent about what you’re doing with it
  • Allow people to download their data and delete it — a “right to erasure” or “right to be forgotten”
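To make the first and last of those requirements concrete, here’s a minimal sketch in TypeScript of what opt-in consent records and a “right to erasure” handler might look like behind the scenes. The names and in-memory storage are hypothetical, and real GDPR compliance would need legal review:

```typescript
// Hypothetical sketch: opt-in consent records plus a "right to erasure" handler.
// Names and the in-memory store are illustrative, not a compliance implementation.

interface ConsentRecord {
  userId: string;
  purpose: string;   // e.g. "marketing-emails"; one record per purpose
  grantedAt: Date;   // recorded at the moment the user actively opted in
  withdrawn: boolean;
}

const consentStore = new Map<string, ConsentRecord[]>();

// Consent is never assumed: a record exists only if the user explicitly granted it.
function grantConsent(userId: string, purpose: string): void {
  const records = consentStore.get(userId) ?? [];
  records.push({ userId, purpose, grantedAt: new Date(), withdrawn: false });
  consentStore.set(userId, records);
}

function hasConsent(userId: string, purpose: string): boolean {
  return (consentStore.get(userId) ?? []).some(
    (r) => r.purpose === purpose && !r.withdrawn
  );
}

// "Right to erasure" / "right to be forgotten": walk every store holding the
// user's data and delete it, then drop the consent records themselves.
function eraseUser(
  userId: string,
  dataStores: Array<{ deleteUser: (id: string) => void }>
): void {
  dataStores.forEach((store) => store.deleteUser(userId));
  consentStore.delete(userId);
}
```

The key idea: consent is something the user actively created, never a default, and erasure has to reach every system that holds their data.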

Impact of Regulations

31 of 78

In 2018, California passed its own version of the GDPR — the California Consumer Privacy Act — to give Californians more control over how their personal data is used.

Its requirements are very similar to those in the GDPR.

But CCPA differs in that it (currently) allows businesses to collect your information by default—though they still have to offer the ability to opt out

California Consumer Privacy Act 2018

Impact of Regulations

32 of 78

In March 2021, California announced it’s banning “dark patterns.”

It also unveiled a new “Privacy Options” icon for businesses to show you where to opt out of data collection.

The icon was designed by Carnegie Mellon’s CyLab and the University of Michigan’s School of Information.

Impact of Regulations

33 of 78

New York, Maryland, Massachusetts and Hawaii are developing their own privacy laws, too.

So, if we’re designing for GDPR and California privacy laws and more, then we may as well design for everyone — design for the highest common good.

And assume things will continue to change with more of this emphasis upon privacy, security and transparency.

Impact of Regulations

34 of 78

What’s Our Role?

35 of 78

What’s our role then?

Our Role

“You were not hired to get approval or to have your work pinned to the company fridge.”

“People hire you to be the expert, so you might as well be the expert.”

—Mike Monteiro, designer, co-founder of Mule Design in Ruined by Design

36 of 78

Specifically?

We have a responsibility to act as the advocate for users — but even that’s too abstract.

The term “user” tends to strip people of their individual circumstances, their personality, their history, even their lives.

We have a responsibility to real human beings. 

We may need to push back where necessary in terms our clients understand.

Our Role

Photo by Vince Fleming

37 of 78

We may have to explain to our clients the impacts of ignoring privacy and security concerns.

What are these impacts, specifically?

  • Civic responsibility: As user-centered designers, we really should be encouraging our clients to treat their “end users” as human beings, who are members of their community

  • Reputation management: We may have to remind our clients that what companies do can undermine their brands

  • Using dark patterns may anger people and cause them to abandon your site in favor of another with a more transparent experience

  • Data breaches and sloppy treatment of data may lead to the loss of their user base — likely affecting their profits

  • Financial consideration: Keep in mind the increasing number of laws and regulations and the resulting fines for not following them

Even if there’s an up-front cost to designing for privacy and security, the long-term costs of neglecting them can be devastating

Our Role

38 of 78

In the 1940s, a Frenchman named René Carmille was working on the French census.

He and his team have been dubbed the first “ethical hackers.” They decided to sabotage their own machines so the punch cards couldn’t register people’s religion properly.

The team was discovered, arrested by the Nazis, and tortured. Carmille died at Dachau.

But they prevented the Nazis from discovering the identities of tens of thousands of Jewish people living in France, saving their lives in the process.

They did so by changing an experience to maintain people’s privacy.

Rene Carmille

39 of 78

In 2019, five employees quit their jobs at GitHub, a major San Francisco tech company, after learning the company had renewed a contract with Immigration and Customs Enforcement, or ICE, the government agency which has been repeatedly accused of human rights violations, especially related to its treatment of immigrants.

It might be scary to speak up in such a situation, but we got into this business to help people — and what we do has a real-world impact.

Our Role

40 of 78

Best Practices

41 of 78

In her Privacy by Design manifesto, Dr. Ann Cavoukian lays out 7 foundational principles for implementing and mapping Fair Information Practices.

She recommends making privacy the “default setting” in our designs, for example, and says privacy should be “embedded” into design. 

So, what are some practical ways to ensure we’re doing that?

Best Practices

Self Study:

Privacy by Design: The 7 Foundational Principles

by Dr. Ann Cavoukian

Founder of Global Privacy & Security by Design and the former Information and Privacy Commissioner for the Canadian province of Ontario

42 of 78

Avoid dark patterns

Dark Patterns

1

43 of 78

Dark Patterns

UX designer Harry Brignull coined the term “dark pattern” in 2010.

He defines a dark pattern as a “user interface that has been carefully crafted to trick users into doing things” they didn’t mean to do — like buying or signing up for something.

Another researcher described dark patterns as supplanting user value “in favor of shareholder value.”

44 of 78

Brignull identified about a dozen types of dark patterns.

Bait and Switch – You set out to accomplish one thing but something else completely undesirable happens.

Confirmshaming – You try to unsubscribe from something, for example, and the feature to opt out uses language to guilt you out of taking action.

Friend spamming – A site asks for access to your contacts so you can find your friends, then it emails all your friends without your permission.

Dark Patterns

Example of confirmshaming

45 of 78

Just for fun, here’s a more carefully crafted unsubscribe message, which still manages to give you one last chance before you go.

It still places a hurdle in front of you before unsubscribing, but at least it’s playful and self-aware.

Dark Patterns

46 of 78

Dark Patterns

“Dark patterns are the canaries in the coal mine of unethical design.

A company who’s willing to keep a customer hostage is willing to do worse.”

— Mike Monteiro, Ruined by Design

47 of 78

Dark patterns can expose users’ personal information

When you make a payment on Venmo, it defaults to public, so you automatically share your payments with … everyone

The opposite of designing with privacy as a default

Somebody created Vicemo, which scraped public payments containing words associated with drugs, alcohol, or sex and posted them online for all to see
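Defaults like Venmo’s are an implementation choice, and the fix is small. Here’s a minimal, hypothetical sketch (not Venmo’s actual code) of privacy as the default setting:

```typescript
// Hypothetical sketch: visibility defaults to the most private option.
type Visibility = "private" | "friends" | "public";

interface PaymentSettings {
  visibility: Visibility;
}

// Privacy as the default setting: sharing requires a deliberate choice.
function createPaymentSettings(
  overrides: Partial<PaymentSettings> = {}
): PaymentSettings {
  return { visibility: "private", ...overrides };
}

const quiet = createPaymentSettings();                        // "private"
const loud = createPaymentSettings({ visibility: "public" }); // explicit opt-in
```

Sharing still works; it just can’t happen by accident.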

Dark Patterns

48 of 78

Similarly, Strava, a popular app for runners, automatically tagged other runners you passed unless they changed their settings.

This feature even had a name: Flyby.

If you clicked on a face, it showed the user’s full name, picture and a map of their running route — effectively revealing where they lived.

This happened without you following users and without them knowing they were sharing their activity.

After receiving criticism, Strava did change the default setting to private.

But it should have always been private.

“Stalkerware”: apps which allow people to be tracked, intentionally or not

Dark Patterns

49 of 78

In this example from a major airline, the customer has already chosen Basic Economy, but “Move to Main Cabin” — which costs $100 more — is placed as a large red button where you’d typically find a “Next” button.

To keep your first selection, you have to click “Continue with Basic Economy” — the smallest, lowest-contrast copy on the module.

Patterns like this interrupt users’ experience, but they also undermine people’s trust and even anger them.

Dark Patterns

Here the pattern is used to trick people into an upsell.

But the same pattern is used to trick people into sharing their personal information in ways they didn’t intend to.

50 of 78

Be transparent about what personal data is used

What Data Is Used?

2

51 of 78

It’s important to be very specific — especially when sharing PII.

Personally identifiable information: data points such as name, email, phone number, Social Security number, and mother’s maiden name, which can be used to steal people’s identities and commit fraud.

87% of the U.S. population can be uniquely identified by just their date of birth, gender, and ZIP code. (Those items aren’t even considered PII.)

Imagine how much damage a bad actor can do with just 3 data points of PII.

The more personal information someone can collect about an individual, the greater the chance they can do serious harm.
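One practical habit this suggests for designers and developers: treat PII fields explicitly in code, so they can’t slip into logs or analytics by accident. A minimal sketch, with hypothetical field names:

```typescript
// Hypothetical sketch: strip known PII fields before a record leaves our system.
const PII_FIELDS = ["name", "email", "phone", "ssn", "mothersMaidenName"] as const;

function redactPII(record: Record<string, unknown>): Record<string, unknown> {
  const safe: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(record)) {
    // Replace PII values instead of passing them through to logs or analytics.
    safe[key] = (PII_FIELDS as readonly string[]).includes(key)
      ? "[REDACTED]"
      : value;
  }
  return safe;
}

console.log(redactPII({ name: "Ada", email: "ada@example.com", plan: "basic" }));
// -> { name: "[REDACTED]", email: "[REDACTED]", plan: "basic" }
```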

What Data Is Used?

52 of 78

Be transparent about why specific personal data is collected or shared

Why Is Data Used?

3

53 of 78

Consider this an opportunity to explain to users the benefits of sharing their data:

  • Does it ensure a better experience in the future?
  • Does it personalize ads and offers for them?

Be prepared to explain those benefits in detail.

And if you can’t, consider whether you’re designing the right sort of product.

Why Is Data Used?

54 of 78

Why Is Data Used?

The home insurance app Lemonade sets a great standard for digestible privacy policies.

They include an itemized, detailed explanation of what personal information you’re sharing, and they also explain why.

They also promise never to sell your information to third parties.

“TL;DR: We will never, ever, sell your data to anyone.”

55 of 78

Always use clear, approachable language

Clear Language

4

56 of 78

Clear Language

In 2019, The New York Times studied 150 privacy policies from various tech and media platforms and described what it found as an “incomprehensible disaster.”

It singled out Airbnb’s privacy policy as “particularly inscrutable.”

“This information is necessary for the adequate performance of the contract between you and us and to allow us to comply with our legal obligations.”

Vague language and jargon allow for a wide range of interpretation, making it easy for companies to defend their practices in a lawsuit while making it harder for us to understand what’s really going on with our data.

57 of 78

Twitter advises you to read its privacy policy in full but highlights key aspects up front, in a dedicated section, calling your attention to those particular points

Clear Language

58 of 78

Clear Language

Tru Luv is an app that focuses on “healing our relationship with technology.”

They use simple, straightforward language to explain how your personal data is used when you’re setting up the app:

“If you like, you can help TRU LUV by giving permission to collect anonymized data through Unity Analytics. This helps them catch bugs and blips.”

They prompt you to give permission or to opt out before you even proceed.

59 of 78

Some guidelines:

  • Avoid legalese and jargon: Even your terms and conditions content doesn’t have to sound like legal content
  • Consider different age groups and levels of savviness
  • Most adult Americans read at about a basic or intermediate literacy level
  • 50% can’t read a book written at an 8th-grade level
  • The Content Marketing Institute recommends writing for about a 14- or 15-year-old (about the 8th grade) — see the readability sketch below
  • Carefully crafted personas can help determine if an experience’s reading level should vary from that range
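One way to keep yourself honest about reading level is an automated check. Here’s a rough sketch using the standard Flesch-Kincaid grade-level formula; the syllable counter is a crude vowel-group heuristic, so treat the output as a smoke test rather than a verdict:

```typescript
// Rough sketch: estimate the Flesch-Kincaid grade level of policy copy.

function countSyllables(word: string): number {
  // Crude heuristic: count groups of consecutive vowels; always at least 1.
  const groups = word.toLowerCase().match(/[aeiouy]+/g);
  return Math.max(1, groups ? groups.length : 0);
}

function fleschKincaidGrade(text: string): number {
  const sentences = Math.max(1, (text.match(/[.!?]+/g) ?? []).length);
  const words = text.split(/\s+/).filter(Boolean);
  const syllables = words.reduce((sum, w) => sum + countSyllables(w), 0);
  // Standard formula: 0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
  return (
    0.39 * (words.length / sentences) + 11.8 * (syllables / words.length) - 15.59
  );
}

const copy = "We collect your email address. We use it to send receipts.";
console.log(fleschKincaidGrade(copy).toFixed(1)); // aim for roughly 8 or below
```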

Clear Language

Photo by John-Mark Smith

60 of 78

Give users options to control their own data

User Controls

5

61 of 78

User Controls

Google offers a Privacy Checkup with high-level descriptions of how your personal data is being used and why.

This links to specific privacy controls, which allow you to adjust how that data is accessed.

You can turn off activity tracking, location history, and your YouTube history, adjust your Google Photos settings, check which third parties have access to your account information, and reach other key settings, all in one sort of privacy dashboard.

62 of 78

A good moment to recall Dr. Cavoukian’s maxim:

Keep these settings private by default

User Controls

63 of 78

This module claims the site is keeping your cookie settings to a minimum by default.

In this example, the messaging is prominent, but …

Are these cookies set to the minimum?

Or the maximum?

You can’t be sure without opening the “Manage Cookies” link to find out.

They do make it super easy for you to “Accept & Continue,” though.

User Controls

How could you design a more streamlined version that still makes the options clear?
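One hypothetical answer, sketched in TypeScript: keep non-essential cookies off until the user turns them on, and make accepting and rejecting equally effortless:

```typescript
// Hypothetical sketch: cookie consent with non-essential categories off by default.
interface CookieConsent {
  essential: true;      // always on; the site can't function without these
  analytics: boolean;
  advertising: boolean;
}

// Private by default: declining is the zero-effort path.
const defaultConsent: CookieConsent = {
  essential: true,
  analytics: false,
  advertising: false,
};

// Both choices take exactly one click; "reject" isn't buried in a settings page.
function acceptAll(): CookieConsent {
  return { essential: true, analytics: true, advertising: true };
}

function rejectNonEssential(): CookieConsent {
  return { ...defaultConsent };
}
```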

64 of 78

Ensure these privacy features are placed contextually and easy to find

Easy to Find

6

65 of 78

Easy to Find

Such important information

shouldn’t be placed in 8-point font …

buried in the Terms & Conditions …

hidden in the footer …

or several levels of navigation down deep in your app

— and yet, that’s often where we find it

This is where a feature like California’s new “Privacy Options” icon would come in handy, too, drawing additional attention to these settings.

66 of 78

Easy to Find

Contextual and easy to find also means …

Onboarding — Explaining in detail how you use people’s data when they’re using your app for the very first time.

“Just in time” alerts – Alerting users in the moment, when they’re about to share data in a new way, even if they have a long history of using your experience. (See the sketch below.)
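Here’s a minimal sketch of the “just in time” idea: check for a stored decision at the moment a new kind of data sharing is about to happen, and prompt if there isn’t one. The askUser function is a stand-in for whatever permission dialog your app actually uses:

```typescript
// Hypothetical sketch: ask in the moment, when data is about to be shared in a new way.
type Decision = "granted" | "denied";

const decisions = new Map<string, Decision>(); // keyed by the kind of sharing

async function ensurePermission(
  kind: string,
  askUser: (kind: string) => Promise<Decision>
): Promise<boolean> {
  if (!decisions.has(kind)) {
    // First time this kind of sharing comes up: prompt in context, right now,
    // even for a long-time user of the app.
    decisions.set(kind, await askUser(kind));
  }
  return decisions.get(kind) === "granted";
}

// Usage: gate the new behavior on an explicit, in-the-moment answer.
async function shareRunningRoute(askUser: (k: string) => Promise<Decision>) {
  if (await ensurePermission("share-location-history", askUser)) {
    // ...only now does any sharing actually happen
  }
}
```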

Photo by Sigmund, Unsplash

67 of 78

Easy to Find

Mozilla displays robust privacy information by default, in a dedicated tab, when you download and open its Firefox browser for the first time.

68 of 78

Easy to Find

Facebook offers a Privacy Checkup one or two clicks away everywhere on its desktop site. It’s pretty easy to find on the app as well.

(If you know to look for it.)

69 of 78

Easy to Find

Onboarding example from Babbel

70 of 78

Remind users regularly about their privacy options

And actively encourage them to take advantage of them

Reminders

7

71 of 78

Reminders

Facebook lets you set reminders to do a privacy checkup every week, month, 6 months, or year.

Google also has a feature that will send you a reminder to check your privacy settings.

72 of 78

One final point:

Never change users’ privacy settings without telling them in advance.

They should also have the option to opt out of such changes.
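As a sketch of that principle in code: when a new settings version ships, carry existing choices forward unchanged, and give any new field the most private default rather than flipping anything on the user’s behalf. All names here are hypothetical:

```typescript
// Hypothetical sketch: a settings migration that never silently loosens privacy.
interface PrivacySettingsV2 {
  profileVisible: boolean;
  likesVisible: boolean; // new in v2; defaults private, never flipped for the user
}

function migrateSettings(old: { profileVisible?: boolean }): PrivacySettingsV2 {
  return {
    // Existing choices carry forward exactly as the user left them.
    profileVisible: old.profileVisible ?? false,
    // New fields start at the most private value; only the user changes them.
    likesVisible: false,
  };
}
```

Any change beyond this, like making likes public, would require telling users in advance and getting their explicit consent.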

Never Change Without Notice

8

73 of 78

A few years ago, Facebook made users’ “likes” visible overnight, which may have outed some people in the LGBTQ community or revealed people’s personal, political, or religious beliefs.

When I asked an employee how they justified this change, they responded that the company valued transparency and wanted people to be transparent about their interests.

The company’s founder, Mark Zuckerberg, had even famously said privacy was no longer a “social norm.”

Never Change Without Notice

74 of 78

We don’t have the right to make decisions about other people’s personal data and interests on their behalf.

Assuming everyone’s information can safely be made public is a belief that comes from a position of privilege.

We should never make decisions like these, which can profoundly affect people’s privacy, without their explicit consent.

Never Change Without Notice

75 of 78

In Conclusion

76 of 78

We talk a lot about “empathy” in design.

If we design with empathy, we won’t design experiences we wouldn’t want to use ourselves.

And we won’t design using “dark patterns” either.

Conclusion

Photo by Josh Calabrese

77 of 78

Privacy is not about secrecy.

It’s all about control.

— Dr. Ann Cavoukian

If we want to ensure people have control over their own personal information …

If we want to ensure the experiences we design are user-friendly and truly “user-centered” …

… we’ll keep these best practices for privacy in mind.

Conclusion

Photo by Zanardi, Unsplash

78 of 78

Thank you!