Detecting and analyzing Internet shutdowns with IODA

Facilitator: Philip Winter

Understanding the Economic Impact of Internet Censorship

Facilitator: Ihsan Ayyub Qazi

Psiphon: Software for the Soft War?

Facilitator: Nathalie Marechal

Gendered Surveillance: mapping actors, practices and excuses used to control female and non-binary bodies

Facilitator: Joana Varon

Technical Measurement of National Information Controls Models

Facilitator: Valentin Weber

Let's Talk: Digital Security Trainings

Facilitator: Sukhbir Singh

Privacy issues in budget smartphone operating systems

Facilitator: Francis Monyango

Platformocracy: Tracking Governance by Platforms Around the World

Facilitator: Cynthia Khoo

Quien Defiende Tus Datos: Corporate responsibility on the fight for privacy

Facilitator: Leandro Ucciferri

IoT Privacy and Security Standards

Facilitator: Tatevik Sargsyan

Mapping Discriminatory Effects of Website Blocking

Facilitator: Joss Wright

Lower-latency Anonymity

Facilitator: Benjamin VanderSloot

Measurements from Embedded Devices

Facilitator: Simone Basso, Vasilis Ververis

Why Transparency Reports Need To Be More Transparent

Facilitator: Sergei Hovyadinov

Creating assessment tools for algorithmic accountability

Facilitator: Andrea Hackl

Understanding an electoral system assisted by technology: The Colombian case

Facilitator: Pilar Saenz

Emerging Issues in Online Speech and Censorship in Canada

Facilitator: Jon Penney and Lex Gill (and Bram Abramson TBC)

Integrating gender awareness for achieving excellence in research and intervention for digital rights and freedoms

Facilitator: Temitope Ogundipe

Researching Gender and Cybersecurity

Facilitator: David Kelleher and Irene Poetranto

Machine-Assisted Decision Making in Immigration and Refugee Cases

Facilitator: Petra Molnar and Lex Gill

Technology for Emerging Threat Models

Facilitator: Erinn Atwater, Cecylia Bocovich, nesf

International proliferation of criminal sanctions against re-identification of personal data

Facilitator: Mark Phillips

Disrupted Lives: Contextualizing internet shutdowns in data and stories

Facilitator: Angel

Making OONI’s censorship data more actionable

Facilitator: Maria Xynou

Speculative Fiction Writing for Activism: Digitalization in the Global South

Facilitator: Amy Johnson and Grace Mutung'u

Making the Transparency Report complete: Detailing the Surveillance Part

Facilitator: Ho Ming-Syuan and An Jungbae

Tracking and Mitigating the Impact of Automated Threats to Human Rights Online

Facilitator: Jon Penney and Joss Wright

Holding Businesses Accountable for their Human Rights Abuses Abroad

Facilitator: Jon Penney

Evaluating Stalkerware

Facilitator: Adam Molnar

Armed conflicts and information operations

Facilitator: Ron Deibert and Ksenia Ermoshina

Mapping Digital GULAG

Facilitator: Alexander Isavnin

Information Controls in the CIS: showcasing OTF ICFP research projects

Facilitator: Sergei Hovyadinov, Igor Valentovich, Ksenia Ermoshina, Arzu Geybullayeva

Protecting Offline Human Rights Defenders

Facilitator: Ariel Barbosa

Journalists in exile: risk perceptions, threat models and professional transformations of “exiled media”

Facilitator: Ksenia, Lobsang, and Masashi

Purpose and Ethics in Malware Disclosures

Facilitator: Matt Brooks

Citizens and Surveillance: Understanding and measuring self-censorship in quasi-democratic regimes

Facilitator: Chris Gore

Status code 451

Facilitator: Mallory Knodel

An End to End Analysis of VPN Services

Facilitator: William Tolley, Zack Weinberg and Mohammad Taha Khan

Right of Access, proposals for empowerment

Facilitator: Rene Mahieu and Andrew Hilts

Implications of "Going Dark" in Developing Countries

Facilitator: Sunoo Park (MIT)

Detecting and analyzing Internet shutdowns with IODA

Facilitator: Philip Winter

Session Description:  In this session I will introduce IODA, a system to detect and analyze large-scale Internet shutdowns. We will talk about how IODA works, how it can help with understanding Internet shutdowns, and we will explore its user interface by looking at past Internet outages. Finally, I'll do my best to answer all questions. While the session is going to be somewhat technical, everyone's welcome!

Session Objectives: The first goal is to introduce IODA and show what it can do for you. The second goal is to get feedback: What do you like and dislike about IODA? What can we do better? How can we make it more useful to you?

Session Length: 1 hour

Understanding the Economic Impact of Internet Censorship

Facilitator: Ihsan Ayyub Qazi

Session Description: This session will involve discussions and talk(s) on how Internet censorship and the use of censorship circumvention tools (e.g., Tor and Lantern) impacts content providers, ISPs, online advertisers, and users.

Session Objectives:

1. To gain a deeper understanding of the impacts of Internet censorship on different stakeholders in the Internet ecosystem (e.g., ISPs, content providers, and online advertisers)

2. To explore ways to reliably measure the impact of Internet censorship on users, ISPs, content providers, and online advertising

3. To encourage studies and collaborations around measuring the impact of censorship on different countries

Session Length: 1 hour

Psiphon: Software for the Soft War?

Facilitator: Nathalie Marechal

Session Description: After four long years, I have finally completed my PhD dissertation, “Use Signal, Use Tor? The Political Economy of Digital Rights Technology.” The study examines the power relationships behind the development and funding of “digital rights technologies” (censorship circumvention, online anonymity, and secure messaging tools) through four case studies: Psiphon, Tor, Signal, and Telegram. Along the way, I take a closer look at the transnational social movement for digital rights, the U.S. internet freedom agenda, and the peculiar relationship between these two worlds.

I’ve interviewed many CLSI participants for this project over the years, and I want to both give back by sharing what I’ve learned and also seek feedback from this community. What did I get right, what did I get wrong, and what did I miss? As I get ready to submit a book proposal to publishers, I would be very grateful for another chance to learn from the community.

In this session, I'll provide an overview of my research project and present the Psiphon case study chapter, titled "Psiphon: Software for the Soft War?".

Session Objectives:

1) Reflect sociological research into digital rights activism and technology development back to the studied population, in keeping with participatory research principles

2) Receive feedback on this research to ensure it reflects the experience(s) and beliefs of the studied community, including Psiphon developers and users

3) Improve the study in advance of submitting a book proposal to publishers

Session Length: 2 hours

Gendered Surveillance: mapping actors, practices and excuses used to control female and non-binary bodies

Facilitator: Joana Varon

Session Description: Most discussions of surveillance in recent years have focused on state surveillance and its intersection with the massive data collection that underlies the business models of tech corporations.

But this overlooks the different types of surveillance that women and non-binary people face based on their gender and sexual preferences. Unlike traditional surveillance, this kind of surveillance is perpetrated by family and community members, in addition to states and corporations, under the excuses of morality, public health, population control, normativity and even marketing.

Vocal feminists questioning gender roles, women in politics, female journalists, people working in or exercising their sexual and reproductive rights, non-binary people performing against gender norms and/or heteronormativity, and even women who are not politically active are all being targeted by different surveillance and censorship tactics that maintain the values of patriarchy.

In this session we would like to kick-start a discussion based on concrete cases collected by digital security trainers who support actions against online gender-based violence and surveillance, and extrapolate from these cases towards a broader methodological framing: what does gendered surveillance mean, and which tactics to counter it differ from the traditional surveillance discourse and counter-tactics?

Session Objectives:

The set of actors, practices and excuses used to perform gendered surveillance is significantly different; it therefore poses a different set of questions for understanding surveillance through a gender lens and for proposing measures to counter it.

Focusing on this challenge, this session will have the objectives to:

Expand the diagnostic: Broaden the discussion of surveillance to add a gendered and intersectional lens to it.

Map cases: Map a series of cases of surveillance and censorship performed due to gender, and build a matrix of actors, practices and rationales in order to allow for a better understanding of this aspect of surveillance.

Counter gendered surveillance: Brainstorm on the existing actions and the actions needed from the digital security community to counter this particular kind of surveillance.

Session Length: 1 hour

Technical Measurement of National Information Controls Models

Facilitator: Valentin Weber

Session Description: The aim of this session is to identify intrinsic characteristics of a country's approach to information controls and the diffusion thereof. What technologies or techniques are visible via network measurement? How can they be measured across time? Is the measurement scalable to a global level? If not, what are the primary inhibiting factors? The session will focus on countries that reflect the audience's interests and could include countries in Asia, Africa, Latin America, or Eastern Europe.

Session Objectives: The goal is to bring together a technical and social science audience, coming from various geographical backgrounds, in order to exchange ideas and personal experience that will facilitate cross-disciplinary and geographical insights. The primary objective is to identify technical tools and techniques that can be used or that might be developed in order to track the characteristics of national information controls models.

Session Length: 1 hour

Let's Talk: Digital Security Trainings

Facilitator: Sukhbir Singh

Session Description: Let's talk about digital security trainings and share our experiences as participants and trainers.

Session Objectives: This session will talk about digital security trainings from the perspective of both participants and trainers, so come and share your experience about what worked and what didn't.

Session Length: 1 hour

Privacy issues in budget smartphone operating systems

Facilitator: Francis Monyango

Session Description: The session will be on the privacy implications of mobile phone affordability and on security vulnerabilities affecting low-income segments of the market. These budget smartphones usually run outdated versions of the Android OS, which require less disk space. They come loaded with a host of default applications, such as browsers, messaging apps and social media apps, with no permission settings. This exposes users to many security risks while breaching their privacy.

Only 23 African countries have put in place comprehensive privacy and data protection laws, which means that many smartphone owners in countries without privacy laws are left to their own devices when it comes to privacy and online data protection. These smartphones may be exposing their users to unethical data mining by big internet companies.

While there is excitement that mobile devices are reducing the digital divide in Africa, privacy standards should be enforced to ensure that fundamental rights such as the privacy of smartphone consumers are protected. Communications regulators ought to insist that only phones whose operating systems offer clear permission options and regular software updates are accepted into their markets.

Session Objectives:
To discuss the regulation of smartphone OS standards for the purpose of protecting the privacy of low-end smartphone users.

To discuss the role of the communications regulator in this situation, including possibly having the regulator take up a technical quality audit role that includes conducting device OS quality tests.

To discuss the policy considerations that would have to be in place if the regulator takes up the role of inspecting the quality of smartphone OSs.

To discuss whether regulators have a duty to create awareness among these consumer groups of the risks they are exposed to when using some phones.

Session Length: 1 hour

Platformocracy: Tracking Governance by Platforms Around the World

Facilitator: Cynthia Khoo

Session Description: Participants will be asked to review a proof-of-concept of a global database to track instances of "governance by platform" around the world. Examples include: how a Google AdWords policy changed the rehabilitation centre industry; how Facebook can influence users’ moods, change voter turnout, and enable illegal housing and employment discrimination; and how Airbnb distorts entire cities’ rental markets and exacerbates housing crises. Such phenomena are often covered individually in depth by media, researchers, or scholars, but it would be valuable to have the simple facts of their occurrence systematically recorded in one database that could be sorted by, for instance, country, topic, platform, and date, for reference and further analysis. This session will put forward a proposal for how such a database might work, and will seek feedback and advice from participants to make it as effective and useful as possible.

Session Objectives: The objective of this session would be to review and give feedback on a proposed global database that tracks instances of major online platforms engaging in overt or indirect acts of governance that systematically impact users and society more widely. This would not be a content takedown or censorship database, but is inspired by them (e.g. Lumen). It would log, for example, certain policies or decisions that have wider, systemic repercussions not just on the platform’s user base but on politics, social welfare, human rights, or democracy more broadly. Feedback would directly inform the next stages of the project in building and populating the database itself.
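
As a minimal sketch of the kind of record such a database might hold, sortable by country, topic, platform, and date as the description proposes, the field names and example entries below are illustrative assumptions, not a settled schema:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record for one "governance by platform" event.
# Field names are assumptions for illustration, not an existing schema.
@dataclass
class GovernanceEvent:
    platform: str   # e.g. "Facebook", "Airbnb"
    country: str    # country where the effect was observed
    topic: str      # e.g. "housing", "elections", "advertising"
    occurred: date  # when the policy or decision took effect
    summary: str    # one-line description of the event

def sort_events(events, keys=("country", "topic", "platform", "occurred")):
    """Sort records by any combination of the sortable fields."""
    return sorted(events, key=lambda e: tuple(getattr(e, k) for k in keys))

events = [
    GovernanceEvent("Airbnb", "Canada", "housing", date(2018, 1, 15),
                    "Short-term rentals distort a city's rental market"),
    GovernanceEvent("Google", "US", "advertising", date(2017, 9, 1),
                    "AdWords policy change reshapes rehab centre industry"),
]
by_platform = sort_events(events, keys=("platform",))
```

The point of the sketch is only that a flat, typed record with a handful of shared fields is enough to support the sorting and filtering uses the proposal describes.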

Session Length: 1 hour

Quien Defiende Tus Datos: Corporate responsibility on the fight for privacy

Facilitator: Leandro Ucciferri

Session Description: The regional investigation "Who Has Your Back" ("Quien Defiende Tus Datos" in Spanish), developed by EFF and different partners in Latin America, seeks a better understanding of the best practices that ISPs can adopt in order to align with human rights standards on the Internet.

It also aims at a general understanding of how companies currently work with regard to data protection, in order to develop recommendations that companies can adopt in the short and long term.

This session will present, in an open discussion format, the findings of this research in different Latin American contexts: What are the current standards that companies adopt with regard to privacy and surveillance? What guarantees do these billion-dollar companies offer their clients? These and other issues will be addressed in this session.

Session Objectives:

1- Understanding the role of companies in the protection of human rights on the Internet.
2- Identifying best practices that ISPs can adopt to protect the privacy of their customers.
3- Providing a general description of the Latin American ISP context in relation to privacy and data protection.

Session Length: 1 hour

IoT Privacy and Security Standards

Facilitator: Tatevik Sargsyan

Session Description: As part of the ongoing collaborative testing project managed by Consumer Reports that uses the Digital Standard to assess IoT products, RDR helps develop new privacy and security standards for issues raised by emerging technologies, for incorporation into the Standard. Our assessment of over 20 IoT products has revealed the need to develop additional standards to address IoT privacy and security, including due diligence in the complex IoT supply chain and related corporate policies and disclosures. With help from a broad community of experts, we hope to develop new standards and establish an effective IoT assessment framework for consumer organizations that also serves as a set of guidelines for companies in the IoT ecosystem.

Session Objectives: The main objective of the session is to invite privacy and security experts to collaborate and contribute to our open source project and help us develop new criteria dealing with IoT privacy and security to be incorporated into the Digital Standard. Another goal is to raise awareness about the Digital Standard and invite experts and organizations to contribute to all of its components through GitHub. We are continuously reviewing recommendations and issues raised on GitHub and revising our standards and accompanying methodology.

Session Length: 1 hour

Mapping Discriminatory Effects of Website Blocking

Facilitator: Joss Wright

Session Description: This session will discuss the range of discriminatory effects that information controls can have. This includes 'primary' effects when explicitly aimed at a disadvantaged group, but also 'secondary' effects when such groups are not directly the focus of the information controls: are particular groups disproportionately affected when information related to, for example, sexual health, is blocked?

We are interested both in 'human' discriminatory decision-making and in the effects of values encoded into algorithmic decision-making in filtering.

Session Objectives:  The top-level objectives of this session will be: To gather and identify case studies: we are particularly interested in speaking to individuals with experience of the secondary effects of filtering, and to speak to groups with whom we can co-create the research questions going forward. Are there groups that we have not considered? What topics might have particular secondary effects?

To develop a framework for analysis: what are the useful questions to be answered, and how might we best go about answering them?


To identify data sources and partners: a number of projects already have data on blocking events; this session will discuss how these might best be used as a component of a larger project to understand the effects of information controls.

Collaborative work: We are keen to identify partners, large and small, that might be interested in joining a project to investigate these effects.

Session Length: 3 hours

Lower-latency Anonymity

Facilitator: Benjamin VanderSloot

Session Description: In this session we will explore people's perceptions about and ideas for anonymity techniques that have lower latency behaviors than Tor. This will likely include discussions of appropriate threat models and what designs are most effective under each.

Session Objectives: While we do not have concrete deliverables, in this session we hope to identify a few self-consistent models of perceptions, threats, and designs.

Session Length: 1 hour

Measurements from Embedded Devices

Facilitator: Simone Basso, Vasilis Ververis

Session Description: Initiate a conversation about the pros and cons of network measurements submitted from embedded devices.

Session Objectives: - Discuss more privacy-preserving and less risky network measurements
- Present reports of internet censorship incidents and use cases that would not be possible without hardware-based platforms

Session Length: 1 hour

Why Transparency Reports Need To Be More Transparent

Facilitator: Sergei Hovyadinov

Session Description: Transparency reports, which tech giants publish once or twice a year, were meant to shine a spotlight on government censorship around the world. Some critics argue that these reports have now begun to serve a very different goal -- to demonstrate to local authorities that a company is serious about compliance with their local demands.

Based on my analysis of what Twitter reported in its latest transparency report and what it actually removed in Russia, I aim to demonstrate the gaps and provide some suggestions on how to make transparency reports more meaningful.

Session Objectives: I welcome to this session anyone who works on internet censorship and the role of intermediaries as proxy censors for autocratic governments. The goal of the session is to brainstorm and build strong arguments for an advocacy campaign to encourage tech giants to share more data about their content removal practices.

Session Length: 1 hour

Creating assessment tools for algorithmic accountability

Facilitator: Andrea Hackl

Session Description: Ranking Digital Rights works with an international community of researchers to set global standards for how ICT companies should respect users’ rights. We produce an annual Corporate Accountability Index that ranks 22 of the world’s leading ICT companies based on 35 indicators that assess companies’ respect for user privacy and freedom of expression. We are currently responding to concerns over the threats that internet companies’ algorithms can pose to user rights by expanding our Index to create indicators that will set transparency standards around companies’ deployment of algorithms. We hope to include these indicators in our 2019 Corporate Accountability Index.

Session Objectives: From our work with the RDR Corporate Accountability Index, as well as several Index adaptations that address various digital rights issues, we know that stakeholder engagement around methodology revisions is key to ensuring indicators reflect the needs of diverse communities. We hope to take advantage of the Citizen Lab Summer Institute to workshop the indicators we have drafted to measure companies’ algorithmic accountability, and to discuss how users worldwide can use the indicators we are adding to the 2019 Index to shine a light on emerging digital rights concerns.

Session Length: 1 hour

Understanding an electoral system assisted by technology: The Colombian case

Facilitator: Pilar Saenz

Session Description: Since October of last year, Karisma, through its privacy and security lab, K+LAB, has been supporting the Electoral Observation Mission in the construction of an audit protocol for the software used to support the electoral system in Colombia. This work, and the analysis of the problems that have arisen with this system in the past, has allowed us to formulate hypotheses about the minimum requirements necessary to guarantee confidence in the Colombian electoral system when technology is introduced into the processing of voting results. Other questions about citizen accountability, data analysis and privacy remain open.

Session Objectives: Present Karisma's analysis of the Colombian electoral system and the use of technology within it. Receive feedback, especially on the minimum requirements for tracking and verifying election results and on best practices for communicating the findings of this kind of research. Receive feedback on the audit protocol.

Session Length: 1 hour

Emerging Issues in Online Speech and Censorship in Canada

Facilitator: Jon Penney and Lex Gill (and Bram Abramson TBC)

Session Description: This session aims to host a critical, facilitated discussion on online speech, content moderation, and censorship in Canada. We hope to convene legal and policy experts to discuss common themes, competing rights, and emerging challenges through the lens of three recent issues — the Google v. Equustek decision, the Privacy Commissioner's Draft Position on Online Reputation (including the adjacent right to be forgotten debate), and the recent FairPlay Canada controversy. In light of Citizen Lab's recent reporting on Netsweeper, we will also discuss the role of Canadian companies in respecting freedom of expression in their business activities abroad. The discussion will be framed through a series of extremely brief expert presentations, followed by a facilitated, in-depth conversation among all participants. The focus will be on the Canadian context, but comparative and international perspectives are both welcome and encouraged.

Session Objectives: We hope to explore creative, frank, and practical emerging angles on the Canadian freedom of expression debate. The goal is to develop a better shared understanding of the various legal rights at stake, their proposed remedies, and the complex policy implications inherent to these debates. This conversation is an opportunity to identify new insights, unexpected points of consensus, and future research directions.

Session Length: 2 hours

Integrating gender awareness for achieving excellence in research and intervention for digital rights and freedoms

Facilitator: Temitope Ogundipe

Session Description: A lack of gender awareness contributes to maintaining, and possibly even strengthening, an undesirable status quo of pervasive gender inequality. It is very important to research the ways women experience themselves and their daily realities in the information society, whether and to what degree their aspirations for participation are met, the degree to which they are excluded from participation, and what needs to change in order for women to benefit from the information society and to co-create it.

Session Objectives: A discussion on women's rights online and barriers to internet access and use. Women face various barriers to using the internet, including the cost of devices and data, lack of awareness and education, and limited digital skills. The session aims to explore these barriers, in particular concerns around safety and security: what should be measured and understood about how women experience online rights issues differently from men, especially in more patriarchal societies, and how this contributes to the gender digital gap in Africa. It is hoped that a research project will be designed out of this session that can inform an expansion of women’s right to privacy and stronger privacy policies in online platforms and workplaces that can protect and defend women against harassment and other forms of tech-mediated violence.

Session Length: 1 hour

Researching Gender and Cybersecurity

Facilitator: David Kelleher and Irene Poetranto

Session Description: The session will open with an introduction from Ron on Citizen Lab's interest in researching the intersection of gender and cybersecurity, e.g., making a submission to the UN Special Rapporteur on Violence Against Women (https://citizenlab.ca/2017/11/submission-un-special-rapporteur-violence-women-causes-consequences/).

A full agenda can be found here:

https://docs.google.com/document/d/11X475RVeK05AIfImrzJrJjX7Zoi9khS7vohaTGlRXNc/edit

The intro will be followed by a short presentation by Santiago and Pepe from R3D and John Scott-Railton from Citizen Lab on our Reckless series of reports and the female victims involved (https://citizenlab.ca/tag/reckless/) (5 minutes)

A short presentation by Kemly from Sula Batsu and Ariel from Colnodo on their gender-related projects, funded by IDRC (5 minutes each)

Open the floor to participants to discuss questions such as the following, facilitated by David Kelleher (Gender@Work, IDRC's gender consultant) and me:

1) In your experience / country / region, what are the most pressing questions on gender and cybersecurity? Aside from gender, what are other aspects of identity (intersectionality) that we should consider?

2) What do you think are the gaps in research / the literature?

3) What are some of the pitfalls of researching this topic that we should be aware of?

4) How can the Citizen Lab best contribute?

Session Length: 1 hour

Machine-Assisted Decision Making in Immigration and Refugee Cases

Facilitator: Petra Molnar and Lex Gill

Session Description: This session convenes CLSI participants, outside academics, and civil society on the issue of AI/ML assisted decision making in immigration and refugee law. It will focus on challenges inherent to algorithmic decision-making schemes adopted by government: from due process and transparency to explainability, privacy, surveillance, bias, safety, equality, and non-discrimination. We will specifically explore the ways in which these issues may be complicated and exacerbated in the immigration and refugee context, which often involves extraordinary consequences and vulnerable individuals. Participants with technical interest in machine learning, AI, and automated decision making are particularly encouraged to attend. We are also seeking insight from lawyers and policy folks with expertise in Charter and international human rights law, immigration and refugee policy, and administrative law.

In terms of structure, we will provide a brief update on our research to date, seek feedback on that work, and facilitate a critical discussion on next steps in this emerging field. Our conversation will span the full lifecycle of immigration- and refugee-related decisions, from initial applications to enforcement, detention and deportation.

Session Objectives: The goal of this session is to solicit engagement on ongoing research coordinated by the International Human Rights Program and Citizen Lab, and to bring together experts across technical and legal disciplinary boundaries on this issue. Outcomes will include contributions to the final report, including recommendations that will provide insight to policymakers in Parliament, OPC, Treasury Board, DOJ, Immigration and Citizenship Canada, and other federal agencies considering the adoption of these tools. We are also interested in exploring international human rights law impacts and comparative perspectives from non-Canadian jurisdictions.

Session Length: 3 hours

Technology for Emerging Threat Models

Facilitator: Erinn Atwater, Cecylia Bocovich, nesf

Session Description: Privacy and communication technologies can empower users and offer a safe means to organize in situations where policy and advocacy break down. However, when these tools are designed in a vacuum without the input of marginalized populations or experts familiar with the social conditions in which they will be used, they can end up causing more harm than good or reinforcing oppressive structures. Our intention with this workshop is to bridge the gap between technical development and expertise in policy, human rights, advocacy, journalism, and marginalized user threat models, with a focus on designing usable tools that can then be deployed to resist censorship, surveillance, compelled data disclosure, and restrictions on information sharing.

We envision the structure of the workshop to comprise discussions on state-of-the-art technologies and emerging issues, with the majority of time spent discussing these issues in small groups of participants with diverse expertise. We would like each group to contain at least one technical expert and to focus on a specific technological problem (e.g., network interference, compulsion resistance, or information sharing). Participants will rotate between groups in order to offer input on the full range of topics for which we can provide technical experts.

Session Objectives:

- identify emerging threat models that fit with existing technologies
- share knowledge across digital rights domains
- critically assess gaps in existing technologies

Session Length: 1 hour

International proliferation of criminal sanctions against re-identification of personal data

Facilitator: Mark Phillips

Session Description: Legal scholarship has recently begun advocating for criminal penalties prohibiting anyone from re-identifying people within ostensibly "anonymized" datasets. Legislative bills that would create criminal re-identification offences carrying prison sentences are now before the UK and Australian Parliaments. Lawyers insist that because anonymization has now been shown to fundamentally fail at its goal of providing a generalizable technical approach to reconciling analytics with privacy protection, we're left with no choice but to turn to a legal solution: severe penalties to deter everyone from misusing freely circulating, exploitable data sets. Privacy and security researchers are unsettled, to say the least, to see that their work appears to fall within the offences as described. A chill on this type of research effectively deprives everyone of the understanding of re-identification risk that is required to make informed privacy decisions, on both an individual and policymaking level. From a historical and empirical perspective, the deterrent effect of severe penalties also tends to be greatly overestimated, particularly when attribution poses challenges. This session will provide an overview of current debates around identifiability and criminalization, and will open into a discussion exploring legal threats facing security and privacy research.

Session Objectives: In the first half of the session, I will present the current trend in the legal literature around this topic. In the second half, I'd like to open into a discussion of the implications for privacy research and to privacy itself. What similarities and differences exist between the privacy-specific context of re-identification when compared with earlier debates around security researchers, for instance around open standards versus security-through-obscurity? What lateral links, if any, can be drawn with other issues that affect research confidentiality and research openness, such as compelled disclosure of research data, or defamation lawsuits that certain researchers are being threatened with by companies whose software products they examine? If sufficient interest exists, the discussion will end by exploring the possibility of following up with a collaborative research report analyzing and explaining the current trends.

Session Length: 1 hour

Disrupted Lives: Contextualizing internet shutdowns in data and stories

Facilitator: Angel

Session Description: Our session aims to share the latest version of the #KeepItOn Toolkit, an integrated toolbox with a strategic framework that combines network disruption detection research and advocacy efforts to slow and eventually stop the trend of government-ordered internet shutdowns around the world. The toolkit, one result of over two years of the global #KeepItOn coalition's fight to stop internet shutdowns, connects approaches and strategies on various fronts -- from research on network disruptions and the impact of shutdowns, to policy advocacy work engaging governments and the tech industry, to grassroots advocacy and public education -- to map out a clear picture for civil society to prioritize and advance their work to protect human rights on and off line.

In this session, we would like to bring together researchers, policy advocates, and civil society actors to scrutinize the toolkit at its current state, discuss and develop methodologies for communities on the ground to further leverage the toolkit, and start building a portfolio of region-specific use cases to harness the connection of advocacy endeavors across the world.

Key tools in this toolkit include the #KeepItOn STOP tracker, a global contextual tracker of shutdown instances that we previously shared at CLSI 2017, and the #KeepItOn Shutdown Stories project, a documentation project that amplifies personal voices and narratives to illustrate the impact of internet shutdowns through the lens of people on the ground. Both projects employ the power of visual storytelling, translating the way we look at data into a universally shared language that lets people in various sectors systematically evaluate the impact of shutdowns and find common ground to stop this kind of information and communication disruption.

The format of our session will be:
--15 minutes of updates on developments in #KeepItOn toolkit since 2017, including creation of Shutdown Stories project and publication of STOP tracker/map
--40 minutes of small group breakouts focused on 1) Shutdown Stories project, 2) STOP methodology, and 3) coalition/community needs
--5 minutes of Q/A and feedback; roadmap and planning

Session Objectives: This session will be a post-RightsCon Toronto 2018 reconvening for researchers, policy experts, and advocacy campaigners working within and outside of the #KeepItOn coalition to exchange ideas and updates, further refine the toolkit, and share lessons and resources.

The objectives of the session will focus on the following areas:

1. Evaluating the STOP methodology to more accurately, consistently, and effectively track shutdowns
2. Aligning expectations and terminology in measurement and database construction on the issue of intentional disruptions
3. Growing the community's awareness and reach of Shutdown Stories project
4. Gathering use cases and feedback for the toolkit, especially from the technologist and researcher community.

Session Length: 1 hour

Making OONI’s censorship data more actionable

Facilitator: Maria Xynou

Session Description: The Open Observatory of Network Interference (OONI) develops free and open source software designed to measure internet censorship. Thanks to its expanding global community, millions of network measurements have been collected from more than 200 countries since 2012. New measurements are automatically published by OONI every day, many of which contain evidence of internet censorship. But how can such data become more actionable? Which questions do researchers want to answer through censorship measurement data? How can OONI support advocates interested in working with censorship data?

This workshop will involve a discussion on how to make OONI data more useful for researchers, lawyers, and human rights advocates. Participants are encouraged to share the questions that they would like OONI data to answer, as well as their needs and challenges in using OONI data. As part of this workshop, participants will also brainstorm on strategies for building a global network that coordinates on monitoring and responding to censorship events worldwide.

Session Objectives: The objectives of this workshop include:

* Collect research questions that community members want to answer through censorship measurement data

* Better understand community needs pertaining to the use of censorship measurement data

* Better understand community challenges pertaining to the use of OONI data

* Collect feedback and ideas on how to make OONI’s data more useful and usable by researchers, lawyers, and human rights advocates

* Collect ideas on how to coordinate better with community members on responding to global censorship events

The above objectives can inform the development and improvement of OONI’s tools, datasets, and methodologies.

Session Length: 1 hour

Speculative Fiction Writing for Activism: Digitalization in the Global South

Facilitator: Amy Johnson and Grace Mutung'u

Session Description: This session, led by Grace Mutung’u and Amy Johnson, explores the use of speculative fiction writing prompts in activism, with a particular focus on digitalization and the global south. The first half of the session equips participants with hands-on experience in using speculative fiction prompts themselves. These initial prompts will be themed around conversations that have emerged from earlier sessions at CLSI. The second half builds on this foundation to 1) collaboratively create new prompts around digitalization and the global south, and 2) brainstorm challenges and possibilities in this context. No previous experience with speculative fiction writing is necessary.

Session Objectives: To introduce speculative fiction as a tool for collaborative imagination and public engagement.

To bring a much-needed spotlight to digitalization and its effects on humans and society in the global south.

To improve and expand our toolkit for the particular use case of activism around issues of digitalization.

Session Length: 3 hours

Making the Transparency Report complete: Detailing the Surveillance Part

Facilitator: Ho Ming-Syuan and An Jungbae

Session Description: After running the Taiwan Internet Transparency Report and handling a number of privacy cases over the past three years, I have found that one of the most difficult problems in transparency work (especially regarding privacy) is that the information and surveillance types that can currently be revealed in the report are not enough for most readers. People want to know concrete details. Clear policies, statistics, laws, and cases cannot satisfy them, and are often not enough to raise public awareness in the end.

Over the past years, while preparing the transparency report, we also found that because some corporations do not cooperate with the government, the government (especially police units) will try to obtain people's personal data by installing large numbers of sensors, establishing large databases through data exchange, and using other intrusive, mobile techniques (such as XRY, Stingray, or remote control software). Most importantly, this information is not necessarily secret; in fact, much of it can be found in public documents. For example, in Taiwan, police units will very briefly introduce the technologies or databases they currently use for surveillance in reports or magazines. What we need to do is find it, investigate it, and list it, systematically.

This may be another large area that needs to be disclosed. Without it, the transparency report would be incomplete and insufficient. This session will discuss what other kinds of surveillance information should be disclosed, what basic information people should know, and how to investigate it.

Session Objectives: 1. Share the basic situation in Taiwan and other countries.
2. Classify the surveillance types that need to be made transparent.
3. For each surveillance type, figure out the format and details that should be disclosed.

Session Length: 2 hours

Tracking and Mitigating the Impact of Automated Threats to Human Rights Online

Facilitator: Jon Penney and Joss Wright

Session Description: One significant emerging threat to human rights online is the increasing automation of legal/regulatory processes on the internet, including automated forms of online speech suppression and surveillance/tracking. Examples deployed on private sector platforms range from AI Twitter bots delivering threats to users to silence speech and promote self-censorship, to automated DMCA “copyright bots” sending millions of copyright takedown notices to internet users around the world every day, chilling online expression at scale, to machine learning algorithms designed to monitor, track, and profile users for social, political, or monetary exploitation. These developments pose complex challenges not only to freedom and security online, but also to researchers seeking to understand and analyze their impact and to activists aiming to counter them: What can be done to address these threats and harms? Are there legal/regulatory solutions? Can systems be built to mitigate the impact of these automated processes? Is greater transparency about these processes a solution?

These are some of the questions that will be tackled in this session, which will be divided into two parts. First, feedback will be sought in relation to a proposed experimental study designed to track, and mitigate, speech chill and other negative impacts associated with the automated enforcement of legal norms at a global scale under the Digital Millennium Copyright Act (DMCA). Second, participants will discuss concrete ideas, proposals, solutions, and research designs to explore for documenting, tracking, and studying these new processes/programs, and their impact. The session will also aim to identify and analyze legal, ethical, and methodological challenges for better research design/strategies.

Session Objectives: This session aims to: (1) obtain concrete feedback on an existing proposed study—including its design and methodology—to track/measure the negative impact / chilling effects of automated DMCA “bots” online on 100,000 Twitter users over the course of a year, and also build automated processes to mitigate those effects. This proposed study is a collaboration between the session leaders and other researchers at Princeton and is funded by MIT’s AI Ethics and Governance Research Fund; (2) document, categorize, and contextualize some concrete examples of private sector forms of automated processes and the potential threats/harms they pose; (3) discuss and document ideas, proposals, theoretical frameworks, research designs, ethical systems, transparency solutions, and data sets to understand these processes and new studies/research methods/designs to measure, explore, and understand the same.

The discussion of the three aims above will lay the foundation for future collaborations, identify new data sets, and surface new case studies to analyze, measure, and explore these newly emerging automated processes online and their impact on human rights. The session will also aim to identify and analyze legal, ethical, and methodological challenges for better research design/strategies.

Session Length: 1 hour

Holding Businesses Accountable for their Human Rights Abuses Abroad

Facilitator: Jon Penney

Session Description: How can private sector companies that violate human rights while operating internationally be better held accountable for these abuses, either internationally or within the national or home jurisdictions where they reside? This is an important issue that often arises in the work of the Citizen Lab and other researchers and activists working at the intersection of ICT, human rights, and global security, where victims of international corporate human rights abuses—like victims of internet censorship or targeted surveillance online—often have no recourse in their own countries, due to corruption, a lack of sufficient laws or independent courts, or the fact that the government itself is involved in the abuses. An example of this is the Citizen Lab's recent Netsweeper report, which documented how a Canadian company's internet filtering technology was being used to contribute to a range of human rights violations, including violations of freedom of expression and non-discrimination. Yet, despite these clear violations, there is little existing recourse for victims in Canada or elsewhere.

Part of the challenge is that there is uncertainty within international law on this count as there is presently no international human rights obligation for states to police the international activities of businesses domiciled in their national jurisdictions. And existing international measures lack teeth.  Yet, states, including Canada, nevertheless have an international human rights obligation to protect human rights and provide effective remedies for victims of human rights violations, including those committed by Canadian companies operating abroad. Can a solution to this challenge be found?

Session Objectives: This session aims to analyze the different social, legal, regulatory, technical, and international dimensions of this challenge, while also canvassing/assessing different legal, regulatory, and technical solutions-- anything from the new statutory and legal remedies for victims, new liability regimes to incentivize better behavior, export controls, to existing and potentially new international mechanisms for recourse (from sanctions to OECD complaints). The discussion will be framed first with Canadian/British/American examples—like the new Canadian Ombudsperson for Responsible Enterprise (CORE) or how international law is applied within Canada’s domestic legal context—but the discussion is expected to be internationally oriented and all perspectives and national experiences are welcome.

Session Length: 1 hour

Evaluating Stalkerware

Facilitator: Adam Molnar

Session Description: For the past year, Deakin University and Citizen Lab researchers have been examining a select group of stalkerware applications. To date, preliminary technical analysis has been done on 2 applications, as well as analyses of 9 companies’ ‘branding materials’ and privacy policies and/or terms of service associated with their applications. Preliminary legal analyses in Canada and Australia have also been conducted.

Session Objectives: In this session, we will be: 1) eliciting feedback on the methods used to date; 2) asking participants to actively help us identify what they believe is notable in the data or research to date; 3) inviting revisions to the existing methods being used for this project; and 4) depending on participant interest, conducting an analysis of one application that has not yet been analyzed. The session will begin with a brief introduction to the project and its rationales, then break into working groups to analyze different elements of the project (i.e., technical, legal, policy). We will reconvene to share insights from the breakout sessions, including a brief discussion about engaging external stakeholders to maximize the impact of the research.

Session Length: 3 hours

Armed conflicts and information operations

Facilitator: Ron Deibert and Ksenia Ermoshina

Session Description: The session draws upon research conducted by the Citizen Lab on armed conflicts and information control. It welcomes participants working on methodological, ethical, and other challenges unique to doing research in that area. Several ongoing conflicts will be discussed, such as Syria, Russia-Ukraine, and others. We will continue to collect hypotheses and literature suggestions, as well as research questions and empirical insights, in order to contribute to a longer-term dedicated research project. Among the questions we want to address: How does communication in zones of armed conflict differ from other situations? What are the specifics of Internet censorship and shutdowns in the context of armed conflicts? How do disinformation, social media, and changes in accessibility affect conflict documentation? What roles do disinformation and social media play in the outbreak of armed conflict versus *during* armed conflict? How does armed conflict transform the communication infrastructure and the ISP/telco industry? How does armed conflict act as an accelerator for deploying alternative forms of networks that are less vulnerable to attack?

Session Objectives:

* Discuss with researchers, activists, and trainers, and collect empirical insights from the frontlines;
* Identify main related methodological and ethical challenges;
* Identify key research questions and hypotheses;
* Contribute to a collective bibliography (academic readings and press articles on the topic)

Session Length: 2 hours

Mapping Digital GULAG

Facilitator: Alexander Isavnin

Session Description: Everything is now becoming "digital": digital culture, digital economy, digital platforms. Regimes are learning well how to use this new digital world to carry out repression. Understanding state persecution of freedoms and rights, and state abuse of technologies and the Internet, has become a truly important objective.

Session Objectives: Finding methods to detect, map, communicate, and resist digital repression.

Session Length: 1 hour

Information Controls in the CIS: showcasing OTF ICFP research projects

Facilitator: Sergei Hovyadinov, Igor Valentovich, Ksenia Ermoshina, Arzu Geybullayeva

Session Description: The session brings together 2017-2018 Open Technology Fund Information Controls research fellows working on information controls in the CIS region. We're going to present intermediate findings from our research, and we welcome contributions from other participants with expertise in network measurements, censorship studies, surveillance studies, the CIS area, social and political science, and law. First, we'll focus on our areas of interest: Azerbaijan, Russia, Ukraine (and Crimea), Belarus, and Kazakhstan; then we'll provide a comparative perspective on CIS specifics.

Session Objectives: Showcase work in progress and receive valuable feedback from the CLSI expert community;

Discuss a set of methodological questions such as:
- how can qualitative ethnographic methods (such as interviews) be coupled with network measurements? How can we improve collaboration between future ICFP social science fellows and technologists?
- how can we use network measurement data to document censorship during political events on a transnational level?
- how can we (better) combine different network measurement projects (e.g., OONI, Netblocks)?
- how can we detect phenomena such as upstream filtering?
- what sources other than network measurements can we use for information controls research? (e.g., RIPE and BGP archives)

Session Length: 2 hours

Protecting Offline Human Rights Defenders

Facilitator: Ariel Barbosa

Session Description: We will present the Digital Security School in Colombia and share knowledge around the following questions: What are the challenges of ensuring the physical security of human rights defenders who are just beginning to use online tools? How can they take advantage of the benefits of ICT to carry out their work without risking their lives and those of their co-workers? What are the riskiest situations to which human rights defenders in Colombia are exposed?

Session Objectives:

* Define digital literacy guidelines that take into account the risks and threats to which human rights defenders are exposed when using the internet

* Document the situation faced by human rights defenders who are just beginning to use ICT
* Document the double vulnerability faced by women leaders: as women and as activists

Session Length: 1 hour

Journalists in exile:  risk perceptions, threat-models and professional transformations of “exiled media”

Facilitator: Ksenia, Lobsang, and Masashi

Session Description: This session brings together researchers and practitioners to discuss a particular phenomenon - that of exiled media - when journalists (and sometimes entire media organizations) have to leave their country as a consequence of political pressure and threats, or of censorship imposed by their governments. How does exile impact journalists' threat models, perception of risks, and sense of security or insecurity? How does exile change communication between journalists and their sources? How does exile challenge the usual distinctions between staff journalists, freelancers, and "amateur" or "civic" journalists?

For this session we will mostly focus on two case studies - Tibet and Crimea - both "disputed" or "occupied" territories where political pressure and Internet and media censorship seriously impact traditional media structures. We will share some of our work in progress on these areas, based on interviews with journalists and network measurements. However, the session aims to go beyond these two case studies, and is designed to include insights from practitioners (journalists, digital security trainers, NGO workers) focused on other high-risk areas, as well as from researchers who have worked on similar questions and faced similar methodological challenges.

Session Objectives: The session will be useful for several groups: journalists, digital security trainers, social science researchers, NGO workers focused on freedom of expression, as well as researchers in international relations and political science.

During the session we plan to:

* Identify key methodological and ethical challenges that researchers face when studying media in disputed areas or high-risk areas;
* Discuss relevant literature, identify gaps and unaddressed questions;
* Identify key threat-models and risk perceptions shared by exiled journalists;
* Based on insights from different areas, sketch a roadmap for security tips, think of tools that journalists in exile may need, identify gaps or unaddressed needs of these communities;
* Build connections between digital security trainers, NGOs and researchers present at the session for further collaboration on selected areas.

Session Length: 1 hour

Purpose and Ethics in Malware Disclosures

Facilitator: Matt Brooks

Session Description: There are multiple reasons in the public's interest to disclose information about surveillance technologies and targeted intrusion attempts. Examples include publicizing technologies being sold to governments with poor human rights records, as well as the targeting of non-violent minority groups and civil society organizations. Arguments have also been made against disclosures, often by researchers who do not want to interfere with law enforcement investigations or cause an actor to change behavior. This session will explore the purposes and desired outcomes of malware and surveillance disclosures, as well as ethical issues to consider before publication.

Session Objectives: One desired outcome of the session is a field survey to gather additional perspectives and data for analysis. Another outcome may be a public interest statement to include when disclosing information about surveillance technologies and targeted intrusion attempts. Other outcomes may be identified during the session as well.

Session Length: 1 hour

Citizens and Surveillance: Understanding and measuring self-censorship in quasi-democratic regimes

Facilitator: Chris Gore

Session Description: Quasi-democracies are regimes that hold both democratic characteristics and authoritarian tendencies. These regimes tend to appear to be stable, with regular elections, but with leaders or practices that have authoritarian tendencies, such as violations or abuses of civil liberties, or electoral practices that are not deemed to be competitive. In sub-Saharan Africa, many states fall under this category of quasi-democracy. They have regular, competitive elections, but elections tend to have characteristics that make it very difficult for a non-incumbent to have any potential for victory. In these regimes, there are rules, laws and practices, that superficially offer rights to citizens and activists, but these are often manipulated or ignored or changed to strengthen the state. Changes to laws and practices relating to freedom of expression and the arrest of citizens or activists articulating dissenting or critical views of government online are common in these regimes.

In this session, the goal will be to review how citizens and activists are changing or do change their use of online tools and practices as a result of real or anticipated surveillance practices in semi-democratic states, and reflect on ways that self-censorship may be evaluated systematically over time.

Session Objectives:

I envision five objectives for this session:

1) to learn what evidence currently exists about citizens self-censoring their online behaviour;

2) to understand the real or potential impacts of self-censorship on quasi-democratic regimes, in relation to political, social, and economic behaviour or activities;

3) to consider the characteristics of citizens that may be more or less likely to self-censor, including the technical network characteristics and online tools that might lead to self-censorship;

4) to examine approaches or methods of measuring and monitoring how and if citizens self-censor their online behaviour in these regimes;

5) to discuss opportunities for a collaborative single or multi-country comparative study of self-censorship, looking towards a submission to a granting agency in the next one to two years.

Session Length: 1 hour

Status code 451
Facilitator:
Mallory Knodel

Session Description: This session focuses on improving HTTP status code 451 as an instrument for increasing transparency around Internet censorship. We will review work to date at the IETF (the RFC, new Internet-Drafts, and peripheral projects and apps) and elsewhere on censorship and blocking, followed by a discussion on making the tool more useful and on next steps.
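
For context, the existing RFC (RFC 7725) defines status code 451 ("Unavailable For Legal Reasons") and recommends that a 451 response carry a `Link` header with the relation `rel="blocked-by"` identifying the entity that implemented the block; that header is central to 451's usefulness for censorship transparency. A minimal sketch in Python, where the URI and wording are illustrative placeholders:

```python
# Sketch of an RFC 7725-style HTTP 451 response. Per the RFC, a 451
# response SHOULD carry a Link header with rel="blocked-by" identifying
# the entity that implemented the block. The URI and explanation below
# are illustrative placeholders, not real blocking entities.

def make_451_response(blocking_entity_uri: str, explanation: str) -> str:
    """Format a raw HTTP/1.1 451 response with the blocked-by Link header."""
    body = explanation.encode("utf-8")
    headers = [
        "HTTP/1.1 451 Unavailable For Legal Reasons",
        f'Link: <{blocking_entity_uri}>; rel="blocked-by"',
        "Content-Type: text/plain; charset=utf-8",
        f"Content-Length: {len(body)}",
    ]
    return "\r\n".join(headers) + "\r\n\r\n" + explanation

resp = make_451_response(
    "https://isp.example.net",  # hypothetical blocking intermediary
    "Blocked pursuant to a court order.",
)
print(resp.splitlines()[0])  # → HTTP/1.1 451 Unavailable For Legal Reasons
```

Measurement tools can then scan for 451 responses and harvest the `blocked-by` relation to attribute the blocking to a specific operator or intermediary.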

Session Objectives: We want to get people engaged in the existing work and come up with new and exciting work.

Session Length: 1 hour

An End to End Analysis of VPN Services

Facilitator: William Tolley, Zack Weinberg and Mohammad Taha Khan

Session Description: In this session we'll be talking comprehensively about the VPN ecosystem. The main aspects that will be covered are:
* Investigating the security and privacy claims of VPN services.
* The accuracy of their advertised geolocation endpoints and their false advertising.
* Active low-level attacks on VPNs that leak user information and browsing habits.

Session Objectives: * Feedback on generally improving our investigative methodologies.
* Discussing methods of publicizing the results beyond academic venues.
* Finding interested collaborators.
* Means of involving the general population and establishing continual evaluation of VPNs.

Session Length: 1 hour

Right of Access, proposals for empowerment

Facilitator: Rene Mahieu and Andrew Hilts

Session Description: Recently there have been many initiatives around using Data Access Requests. The goals of these projects range from making citizens aware of how extensively their personal data is processed, to increasing general transparency about these processes, to demonstrating the inadequacies of current transparency practices.

At the Citizen Lab Summer Institute, we have the chance to bring together many of the people who have been active in this field, along with those who are working on transparency, surveillance, and privacy in other ways. In this session we would like to present some of the findings so far and start a brainstorming process on how we can further use the right of access as a tool for advocacy, academia, and activism.

Session Objectives:

* Find ways in which the collective use of the right of access can serve the community, by brainstorming about its potential uses for advocacy and academia.

* Map who is using access requests, and what we can learn from each other.

* Give people who have no experience with the right of access some insight into how it works, by presenting examples and practices.

Session Length: 1 hour

Implications of "Going Dark" in Developing Countries

Facilitator: Sunoo Park (MIT)

Session Description: The "Going Dark" debate concerns the tension between law enforcement capabilities and the prevalence of encryption that renders data and messages difficult for law enforcement to access. The FBI has claimed that the increasing use of encryption by tech companies is impeding its ability to enforce the law, and has pushed for solutions that grant "exceptional" FBI access to encrypted data. On the other side, technologists (incl. cryptography experts) and civil liberties advocates, among others, have argued that the introduction of such workarounds would fundamentally compromise the security of the systems.

We want to explore the implications of increasingly widespread, by-default encryption in developing countries, whose perspectives are seldom voiced in these heated debates. Although developing countries lack the political and technological clout to make themselves heard in these debates, the prospects of intelligence gathering there are highly relevant concerns, both because their circumstances differ fundamentally from those of the developed countries that dominate the discussion, and because many developing countries have large, fast-growing populations with rapidly accelerating smartphone penetration rates.

Session Objectives:

* Obtain feedback for a working paper, especially from perspectives of different countries and legal expertise.

Session Length: 1 hour