1 of 31

W3C Accessibility 4 Children 09.06.2022

2 of 31

1 Introduction

The Child Online Safety Toolkit contains:

  • 5 things policymakers can do to implement child online safety in law and practice
  • 10 Policy Action Areas, with detailed roadmaps and key practical steps
  • A model policy that policymakers can adopt and adapt for country policies
  • Downloadable worksheets to create a policy fit for practice
  • A roadmap for governments, nation-states and organisations to build, review or improve their policies and practices with respect to children’s rights

3 of 31

2 Guaranteeing online safety

Not just responding to risks and harms: actively designing a digital environment that is safe for every child.

With 1 in 3 people online under the age of 18, the centrality of digital technology in children’s lives means it must be formed with their privacy, safety and rights by design and by default: a preventative and holistic approach.

4 of 31

3 Foundational Resources

5 of 31

4 Principles / Rights

1. Non-discrimination
3. Best interests
6. Right to survival and development
12. Right to be heard

6 of 31

5 “things”

• Risk and harm
• Accessibility and inclusion
• Chain of responsibility
• Child-centred design
• Effectiveness

7 of 31

6 Identifying risk and mitigating harm

The CO:RE 4Cs classification recognises that online risks arise when a child:

• Engages with and/or is exposed to potentially harmful content
• Experiences and/or is targeted by potentially harmful contact
• Witnesses, participates in and/or is a victim of potentially harmful conduct
• Is party to and/or exploited by a potentially harmful contract.

8 of 31

7 Promoting access, accessibility and inclusion

Child online safety materials should be:

  • developed in consultation with children and parents/carers
  • age-appropriate
  • gender-neutral
  • easily accessible to children of different ages and their parents/carers

Use visual materials where literacy is limited, and consistent terms across platforms, to make child online safety more easily understandable and accessible.

9 of 31

8 Lundy Model of child participation

10 of 31

9 Comments

  • Accessibility and inclusion are mentioned, but not the guidelines or tools
  • Accessibility questions are relevant to safety and closely related to education

To be checked for the whitepaper:

  • Child-friendly and accessible version of the toolkit?

  • Importance of privacy, cooperation, supportive tools

11 of 31

10 Model policies

  1. Institutional capacity
  2. Legal and regulatory frameworks
  3. Personal data, identity and autonomy
  4. Response and support systems
  5. Corporate responsibility
  6. Training
  7. Education
  8. Public awareness and communications
  9. Research and development
  10. Global cooperation

12 of 31

Quotes

“While the real world can restrict the way people can express themselves, the digital world should never restrict expression.” – New Zealand, 17

“Even though your privacy and security settings are on, companies/people have advanced techniques to get what they want to know. They likely know more than us and can access more than we know was even available online.” – Canada, 13

“We need to reform social networks to enable children living with disabilities. We need to simplify...the use of the internet in sign language so we, the deaf, can be able to use the digital platforms.” – Tanzania, 15

“I wish that children [in rural areas] can enjoy the benefits of the internet just like the others.” – Malaysia, 13

13 of 31

3 steps for each policy (all policies in the next slides):

1. Model policy text
2. Roadmap to achieving the objective
3. Supportive tools

14 of 31

1. a. Institutional capacity

“States parties should take legislative and administrative measures to protect children from violence in the digital environment, including the regular review, updating and enforcement of robust legislative, regulatory and institutional frameworks that protect children from recognised and emerging risks of all forms of violence in the digital environment.”

Source: General comment No. 25 (2021), para 82

15 of 31

1. b. Model policy text + Roadmap

Establish a National Child Online Safety Steering Committee

• SWOT template to identify the most appropriate lead department or ministry
• Template to identify stakeholders
• Checklist to identify which policy areas address child online safety

16 of 31

2. Legal and regulatory frameworks

Some legal frameworks do not make it clear whether self-generated sexual images that are exchanged on a consensual basis between children will be considered illegal Child Sexual Abuse Material. Even if children are not prosecuted in practice, this legal uncertainty, with its potential for criminalisation, may undermine trust, control and autonomy rights.

Strengthen and enforce laws that prohibit child online safety-related offences.

1. A legal checklist: an example
2. A blank legal checklist to complete

17 of 31

3. a. Personal data, identity and autonomy

Visuals to help understand the breadth and volume of children’s data collected

18 of 31

3.b. AI-powered voice assistants and chatbots

AI-powered voice assistants and chatbots utilise NLP (Natural Language Processing). The use of chatbots can lead to additional risks for children (especially in mental health) when bots do not recognise appeals for help or when they provide inadequate advice.

Facial recognition for biometric identification (p. 83)

Facial recognition systems employ computer vision techniques and machine learning algorithms to determine, process and analyse a person’s facial features with a wide range of aims, such as verifying an individual’s identity against an existing record. For identification purposes, it may be used in border management, crime analysis and prevention, and school surveillance for claimed reasons of improved security.

Facial recognition is increasingly being used as a means of a digital identity “credential” for both legal and functional identification. While not a replacement for legal ID, which makes people visible to a State and is a recognised right, this technology may more quickly or easily validate an existing identity record.
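
To make the verification use case concrete: a face-verification system reduces both the captured image and the enrolled record to numeric feature vectors (embeddings) and accepts a match only if the vectors are similar enough. The minimal Python sketch below is illustrative only and is not part of the toolkit; the embed_face placeholder and the 0.6 threshold are assumptions standing in for a trained model and a calibrated decision threshold.

```python
import numpy as np

# Hypothetical placeholder for a trained face-embedding model (real systems
# use a neural network). Here it just derives a deterministic pseudo-random
# unit vector from the image so the example stays runnable and self-contained.
def embed_face(image: np.ndarray) -> np.ndarray:
    rng = np.random.default_rng(seed=int(image.sum()) % 2**32)
    vec = rng.normal(size=128)
    return vec / np.linalg.norm(vec)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Similarity of two embeddings; 1.0 means identical direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(captured: np.ndarray, enrolled: np.ndarray, threshold: float = 0.6) -> bool:
    """Return True if the captured face is judged to match the enrolled record."""
    score = cosine_similarity(embed_face(captured), embed_face(enrolled))
    return score >= threshold

# Illustrative use: compare a newly captured image against a stored identity record.
captured_image = np.zeros((112, 112, 3))   # stand-in for a camera frame
enrolled_image = np.zeros((112, 112, 3))   # stand-in for the stored record photo
print("match" if verify(captured_image, enrolled_image) else "no match")
```

In practice, the choice of threshold trades false acceptances against false rejections, so the accuracy of any such system depends on how it is tuned and evaluated.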

19 of 31

3. c. Other documents

2. Case Study of the UK’s Age-Appropriate Design Code141
3. UNICEF’s Manifesto for the Better Governance of Children’s Data142
4. OECD Privacy Framework143
5. European Union General Data Protection Regulation (text and tools)144
6. The European Digital Strategy (including proposals on Artificial Intelligence and Data)145
7. Irish Data Protection Commissioner Fundamentals for a Child-Oriented Approach to Data Processing (The Fundamentals)146
8. UNICEF Memo on Artificial Intelligence and Children’s Rights147
9. EU Fundamental Rights Agency Report – Under Watchful Eyes: Biometrics, EU IT Systems and Fundamental Rights148

References:
144. General Data Protection Regulation, European Union, 2018.
145. Shaping Europe’s digital future, European Union, 2020.
146. Fundamentals for a Child-Oriented Approach to Data Processing, Ireland Data Protection Commission, 2020.
147. Artificial Intelligence and Children’s Rights, United Nations Children’s Fund, 2019.
148. Under watchful eyes: biometrics, EU IT systems and fundamental rights, European Union Agency for Fundamental Rights, 2018.

20 of 31

4. Response and support systems

States parties should provide children with child-sensitive and age-appropriate information in child-friendly language on their rights and on the reporting and complaint mechanisms, services and remedies available to them in cases where their rights in relation to the digital environment are violated or abused. Such information should also be provided to parents, caregivers and professionals working with and for children.
Source: General comment No. 25 (2021)

Supportive tools:
• Checklist to develop adequate notice and takedown procedures
• Australia’s Bill for an Online Safety Act177

21 of 31

5. Corporate responsibility (p. 106)

• Consistent community rules across platforms
• Clear timelines for reporting
• Ability to tell victims what happened to their bully
• Better content labelling
• Easy ways to take content down
• A ban on spreading abuse
• Pop-ups that point to good behaviours and encourage use of high-privacy settings
• A clear end point to complaint processes
• Policies written in plain language

22 of 31

6. Training

Child online safety sessions should form a mandatory part of teaching, social work, health work, psychology, and other relevant degree programmes in public and private universities or education institutions.

Supportive tools:
• Checklist of professions that may benefit from specific training, and topics to address

1. Resources for teachers, social workers and youth workers: see NSPCC Online Safety Training
2. Resources for healthcare professionals: e-Integrity’s online child protection training for healthcare professionals (Safeguarding Children and Young People)
3. Resources for law enforcement227: ICMEC provides a range of different training opportunities and courses, such as:
   • Essentials of Technology-Facilitated Crimes Against Children
   • Advanced Online Exploitation Investigations
   • Advanced Technologies
   • Fundamentals of Responding to Missing Children
4. Gender-Based Violence in Namibia: An exploratory assessment and mapping of GBV response services in Windhoek, 2016228
5. Resources for the justice sector229: the Council of Europe’s EndOCSEA project provides training for judges and prosecutors on CSEA.
6. General resources: the Queensland Family and Child Commission has produced a Protecting Children Online Module.230

23 of 31

7. a. Education

States parties should provide and support the creation of age-appropriate and empowering digital content for children in accordance with children’s evolving capacities and ensure that children have access to a wide diversity of information, including information held by public bodies, about culture, sports, the arts, health, civil and political affairs and children’s rights.

States parties should encourage the production and dissemination of such content using multiple formats and from a plurality of national and international sources, including news media, broadcasters, museums, libraries and educational, scientific and cultural organizations.

They should particularly endeavour to enhance the provision of diverse, accessible and beneficial content for children with disabilities and children belonging to ethnic, linguistic, indigenous and other minority groups. The ability to access relevant information, in the languages that children understand, can have a significant positive impact on equality.

Source: General comment No. 25 (2021), paras 51 and 52

24 of 31

7.b. Education: policy

7a. Designate child protection leads

7b. Promote accessible digital education
Promote content, including peer-to-peer programmes, that are designed and shown to help children develop digital skills and empower children to build respectful communities that support child online safety. Digital education should be holistic and should cover data and media literacy, alongside safeguarding issues – in particular issues of sexuality and consent. Education should also be extended to parents/carers to support their role in promoting child online safety.

7c. Promote educational content
7d. Promote data literacy
7e. Promote critical thinking
7f. Introduce formal child online safety procedures in schools

25 of 31

7. c. Education: Roadmap

1. Ensure digital literacy covers all online experiences, not just safety issues.
2. Ensure that sex education, sexuality and consent are taught in the context of the digital world, to ensure that children have maximum agency over related issues that may come up online.
3. Identify digital literacy programmes in an appropriate language or that are available for translation as needed.235
4. Ensure digital literacy for parents/carers is fully aligned with digital literacy for children. Parental resources should be positive and well-rounded and not cause undue panic about the digital world or encourage drastic measures against children.236
5. Check for technology companies offering free digital literacy programmes for both children and adults. These are often very well developed and very effective – but fail to identify the commercial risks and harms of the tech itself.

 

26 of 31

7. d. Education: Supportive tools

Checklist to ensure formal child online safety procedures in schools (p. 126)

Other resources:
1. DQ Child Digital Readiness Kit: 8-Day Home-Based e-Learning for Children (ages 8-12) and Parents240
2. South Africa child online safety241
3. National Association of School Psychologists – A Framework for Safe and Successful Schools242
4. International Task Force on Child Protection – International Child Protection Standards and Expectations243
5. Council of International Schools244
6. NSPCC Online safety training in English245

27 of 31

8. Public awareness and communications (p. 131)

Policy text:
8a. Generate a public awareness programme
8b. Provide accessible information and educational materials
8c. Raise awareness of child online safety in the media
8d. Engage parents/carers and children in discussions about child online safety

Roadmap to achieving the policy:

Supportive tools:
1. Checklist to ensure the awareness programme is comprehensive, mapping each group/audience to the core message to reach them


28 of 31

9. a. Research and development (p. 145)

Regularly updated data and research are crucial to understanding the implications of the digital environment for children’s lives, evaluating its impact on their rights and assessing the effectiveness of State interventions.
States parties should ensure the collection of robust, comprehensive data that is adequately resourced and that data are disaggregated by age, sex, disability, geographical location, ethnic and national origin and socioeconomic background. Such data and research, including research conducted with and by children, should inform legislation, policy and practice and should be available in the public domain.
Data collection and research relating to children’s digital lives must respect their privacy and meet the highest ethical standards.
Source: General comment No. 25 (2021), para 30


29 of 31

9.b. Research and development: policy

9a. Establish child online safety research frameworks
9b. Continued innovation
9c. Establish centres of excellence in research and development in child online safety
9d. Establish strong ethical frameworks for research and development in child online safety273
9e. Establish frameworks for information gathering
9f. Enable access to private companies’ data in the public interest
9g. Ensure that data and statistics are relevant to the context


30 of 31

10. Global Cooperation

10a. Establish formal relationship frameworks (e.g. a Memorandum of Understanding [MoU]) with regional and global child online safety communities
10b. Sign up to regional and international legal instruments that promote cooperation on child online safety
10c. Identify partner countries and organisations that can provide appropriate models and support for child online safety development
10d. Support other countries developing child online safety policies


31 of 31

Thank you