1 of 47

Understanding the Online Child Safety & CSEA Landscape

2 of 47


RLB Safeguarding Ltd Consultancy & Training

Intro:

UK National Crime Agency – Dedicated Online CSEA Unit

Child Safety Lead – Twitter (X)

Head of Online Child Safety Unit – Cyber Threat Intel

Consultant, Advisor, Associate, Trainer

Trust & Safety Professional

Trainer – T&S Academies

Intelligence Research

Andy Briercliffe

Global Online Harms Professional

Online Child Safety Advisor

3 of 47

CONTENT AWARENESS MESSAGE

You will NOT see any harmful content.

However, we will be discussing topics relating to Online Child Sexual content.

IF at any stage you feel uncomfortable, please step out and let someone know.

"It's OK not to be OK"

4 of 47

AGENDA

  • Understanding the Global Landscape
  • Education, Online Harms and Trust & Safety
  • Online Child Safety: crossovers, and what we mean exactly
  • What is CSEA (Child Sexual Exploitation & Abuse): CSAM, CSEM, Platform Policy
  • Detection & Moderation: signals/trends, detection avoidance
  • Laws and Regulations
  • Reporting to relevant Authorities
  • LEA Response
  • Who does what, where and how: awareness of Stakeholders

5 of 47

Today’s ‘digital’ global landscape

EVERYTHING is available 24/7, 365 days a year, across the world

6 of 47

Online Harms Landscape

Online Safety

Trust and Safety

Online Harms

Education

Tech Development

Tech Detection

Content Moderation

Research

Academia

Government

Regulation

Training

Law Enforcement

Age Verification

NGOs

Survivor Groups

Mental Health

Advocacy

Cyber Bullying

Hate & Abuse

Intimate Image Abuse

Child Sexual Abuse Content

Scams/Fraud

Child Exploitation

VAWG

Challenges

Harmful but legal

Terrorism/Extremism

Human Trafficking

Misinformation

Cyber flashing

Deepfakes

Gen AI

(not exhaustive lists)

Platforms

Financial Institutions

Travel Sector

Sport Landscape

Music Industry

Film Industry

Shopping/Retail

Gaming

News Media

Intelligence Research

7 of 47

Education, Online Harms and Trust & Safety

Education is key to Online Safety

Not just Children, but Parents and Adults

Understanding the risks, the dangers, the reporting process

To educate about the Online Harms and processes

Leads to response by Platform T&S teams

Policies, Products, Legal and Operations

8 of 47

The Global Landscape

It's complex!

We have a global society online, each with their own views, thoughts, attitudes and 'interests':

  • Country Laws
  • Ages of Consent
  • Platform Policies
  • Topic Misinformation
  • Culture & Diversity
  • Attitudes & Behaviours
  • Terminologies

9 of 47

Global Laws & Regulations

New laws are being introduced which mean 'platforms' must assess the risk to children and take appropriate action:

  • Safety by Design

Reference: global-csam-legislative-overview-2024-full-report-1.pdf

10 of 47

11 of 47

Age Verification & Assurance

New Laws around the world have imposed Age Restricted access in some form:

Australia – Social Media Ban for Under 16s

France, India, New Zealand – under discussion

UK – Age Verification introduced for adult pornography sites

Social Media also introducing age checks

Users are now simply using VPNs!

12 of 47

What is Online Child Safety?

Terminology:

Child Protection, Child Safeguarding, Minor Safety

  • NCMEC
    • National Center for Missing & Exploited Children
    • Get Help Now
  • INHOPE Classification Schema
    • Available only from INHOPE
  • ECPAT Terminology Guidelines

13 of 47

QUESTION?

What's your understanding of Online CSEA?

14 of 47

Child Sexual Exploitation & Abuse (CSEA)

Includes:

  • Child Sexual Abuse Material (CSAM)

  • Child Sexual Exploitation Material (CSEM)

  • Individual Platform Policies

  • Note: "Child Pornography" (CP)

Although still a legal term in some jurisdictions, its use is advised against.

15 of 47

The landscape of Terminology

Cyber Located Sexual Violence

Tech Facilitated Sexual Abuse

Tech Facilitated Sexual Violence

Tech Facilitated Financial Extortion

Tech Facilitated Blackmail

Sexploitation

Sexual Extortion

Tech Facilitated Child Abuse


Financial Sexual Extortion

Financially Motivated Sextortion of Minors

Online Grooming Solicitation

Criminal CSAM

CSEAM

Financial Grooming

Webcam Blackmail

Child Criminal Exploitation

(CCE)

Tech Assisted CSEA

Online CSEA

16 of 47

PLATFORM POLICIES

  • Individual platforms have their own policies about what is/isn't allowed
  • Internal & external monitoring & reporting procedures
  • What is allowed can depend on the type of platform
  • Some countries implement stricter regulations on certain content

17 of 47

Child Sexual Exploitation Material (CSEM)

Refers to sexualised content depicting minors that is exploitative in nature but does not fall within the classification of nationally illegal child sexual abuse material (CSAM).

It can also include non-illegal images in a series with CSAM, treated as exploitation material due to its investigative relevance and the context of exploitation in which it was generated.

CSEM is not reportable content.

18 of 47

Child Sexual Abuse Material (CSAM)

CSAM has different legal definitions in different countries.

The minimum defines CSAM as imagery or videos which show a person who is a child engaged in, or depicted as being engaged in, explicit sexual activity.

A child is defined as anyone under 18.

UK Categorisation:

  • Category A: Images involving penetrative sexual activity; images involving sexual activity with an animal or sadism.
  • Category B: Images involving non-penetrative sexual activity.
  • Category C: Other indecent images not falling within categories A or B.

19 of 47

Wikimedia Foundation Reminder

Prohibited content includes (but is not limited to):

Photographs, drawings, renderings, or videos of minors depicting:

  • Nude body with a focus on genital areas, where that image has no obvious educational value aligned with the projects' purpose
  • Implicit or explicit sexual intercourse with or in proximity of a minor
  • Simulated sexual activity (including if the minor is fully clothed)
  • Masturbation
  • A minor depicted with or engaging in sexual activity with sex toys
  • Other "sexually explicit conduct" as defined in United States law

Sharing links to (or purporting to be to) such material is also in violation of this policy. Other material may be removed by the Foundation at the discretion of the Legal department as a primary office action pursuant to the Terms of Use.

20 of 47

What can be OCSEA: Examples

  • Self Generated (selfies)
  • Nudism
  • Offender Networks
  • Inappropriate Teen "chat"
  • Sextortion
  • Cartoon
  • Grooming
  • Humour
  • Viral
  • Manga/Anime
  • Family pics
  • Upskirting
  • URL Sharing
  • Outrage
  • AI/Photoshop
  • Glorification
  • Deepfakes
  • Non-CSAM imagery
  • Victimisation
  • 'Fantasy'
  • Detection Avoidance media

21 of 47

Detection Avoidance

22 of 47

Bad actors 'test the system'

They are aware of PhotoDNA (PDNA) hash matching *

* Hashing – the use of media 'fingerprinting' technology
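The idea behind media 'fingerprinting' can be sketched with a toy "average hash". This is not PhotoDNA (that algorithm is proprietary and restricted to vetted partners); the function names and the 8x8 pixel grid below are purely illustrative. The sketch shows why a lightly edited copy of an image can still match its original fingerprint, unlike an exact cryptographic hash, which any single-pixel edit would completely change.

```python
# Toy perceptual "average hash": a 64-bit fingerprint of an image,
# where each bit records whether a pixel is brighter than the mean.
# Illustrative only -- real systems (e.g. PhotoDNA) are far more robust.

def average_hash(pixels):
    """pixels: an 8x8 grid of grayscale values (0-255), e.g. a downscaled image.
    Returns a 64-bit integer fingerprint."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        # One bit per pixel: brighter than average or not.
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of differing bits; a small distance suggests the same media."""
    return bin(h1 ^ h2).count("1")

# A small edit (one brightened pixel) barely changes the fingerprint,
# so the edited copy still matches within a near-duplicate threshold.
original = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
edited = [row[:] for row in original]
edited[0][0] = min(255, edited[0][0] + 30)

d = hamming_distance(average_hash(original), average_hash(edited))
print(d <= 5)  # prints True
```

This is also why "edited images" appear in the detection-avoidance list above: bad actors probe how much alteration a given matching system tolerates before the distance exceeds the platform's threshold.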

Edited images (non violative)

‘Innocent’ emoji use

Open-ended questions/comments

  "we have the same interests"

Wording can be 'subjective' / innocent

  "are you alone?"

23 of 47

What indicators can help you

Indicators

  • Style of language

Consider slang/emoji use

  • Subject content

Different ages have different interests

Music, Film, TV, Fashion etc

References to school/education settings

  • Body additions

Tattoos, Piercings, body aesthetics

  • Age References

Driving, smoking, medical

  • Background Info

Rooms, noise, pictures

24 of 47

Artificial Intelligence Generated (AI) Content

Although still under discussion/confirmation, it is generally accepted that any CSEA content created by AI is classed/assessed the same as non-AI generated content.

25 of 47

General International Law

A child is anyone under 18 (regardless of country)

Indication of sexualised content

Encouragement, Promotion of CSEA content

Criminal offences of:

Possession, Distribution and Creation of CSEA content

26 of 47

CSAM Classification

Different Countries use different types of classification:

Examples:

  • US

  • UK

Cat A, Cat B, Cat C

  • Australia

Class 1, Class 2, Class 2A, Class 2B

27 of 47

Classifying Images

You are not required to assess images to an accurate classification – just awareness.

Only Law Enforcement and other Agencies are legally allowed to assess the correct classification.

However, you must be aware of when priority action is required:

  • Threat to life and/or possible harm
  • Planned meeting with a child
  • Unknown image(s) identified

28 of 47

The COPINE Scale

1

Indicative

Non-erotic and non-sexualized pictures showing children wearing either underwear or swimsuits from either commercial sources or family albums. Pictures of children playing in normal settings, in which the context or organization of pictures by the collector indicates inappropriateness.

2

Nudist

Pictures of naked or semi-naked children in appropriate nudist settings, and from legitimate sources.

3

Erotic

Surreptitiously taken photographs of children in play areas or other safe environments showing either underwear or varying degrees of nakedness.

4

Posing

Deliberately posed pictures of children fully clothed, partially clothed or naked (where the amount, context and organization suggests sexual interest).

5

Erotic Posing

Deliberately posed pictures of fully, partially clothed or naked children in sexualized or provocative poses.

6

Explicit Erotic Posing

Pictures emphasizing genital areas, where the child is either naked, partially clothed or fully clothed.

7

Explicit Sexual Activity

Pictures that depict touching, mutual and self-masturbation, oral sex and intercourse by a child, not involving an adult.

8

Assault

Pictures of children being subject to a sexual assault, involving digital touching, involving an adult.

9

Gross Assault

Grossly obscene pictures of sexual assault, involving penetrative sex, masturbation or oral sex, involving an adult.

10

Sadistic/Bestiality

a. Pictures showing a child being tied, bound, beaten, whipped or otherwise subject to something that implies pain.�b. Pictures where an animal is involved in some form of sexual behavior with a child.

29 of 47

Tanner Scale

30 of 47

Tanner Stages in Females

Stage

Age

Puberty Changes

Stage 1

8-10

Vellus hair, a type of fuzzy hair, may appear in the pubic area. Typically no real breast development yet.

Stage 2

10-11

Thin, straight hair primarily at the labia. Breast buds (small mounds, larger nipples) appear. Peak growth stage.

Stage 3

12

Darker, more coarse and curly hair may appear. Breast enlargement continues. Peak growth stage.

Stage 4 

13

Hair similar to an adult but there's not as much. Nipples develop above breast tissue mounds. Menstrual periods typically begin, vaginal discharge may appear first.

Stage 5

14-16

Hair similar to adult in triangular pattern. Mature breasts have developed. Wider hips are common.

31 of 47

Tanner Stages in Males

Stage

Age

Puberty Changes

Stage 1

10

Minimal vellus-type hair grows in the pubic area, similar to the abdomen. Penis, scrotum, and testes are the same size as in childhood.

Stage 2

11-12

Thin, straight hairs grow at base of penis. Testes enlarge, skin texture and/or color changes on a bigger scrotum.

Stage 3

13-14

Darker, more coarse and curly hair grows at the genitals. Penis length (and girth) grows, testes size continues to enlarge. Sperm production can begin.

Stage 4 

13-14

Pubic hair is similar to an adult's but there's not as much. Penis is notably larger, with visible changes to the glans. Peak growth stage.

Stage 5

15-17

Hair is similar to an adult's in the triangular genital pattern. Reproductive organs are same size as an adult's.

32 of 47

The UK…

We now have the Online Safety Act.

130 topics are classed as "Online Harms" – this means they are illegal.

'Priority' topics:

  • Terrorism
  • Harassment, stalking, threats & abuse offences
  • Coercive & controlling behaviour
  • Hate offences
  • Intimate image abuse
  • Extreme pornography
  • Child sexual exploitation & abuse
  • Sexual exploitation of adults

The UK has very recently introduced age verification for adult pornography sites.

33 of 47

Current High Concerns

Every area under CSEA is concerning; three are of prominent concern:

  • Sextortion – Online blackmail where individuals threaten to share sexual pictures, videos, or information. They may be trying to obtain money or other forms of financial payment, or to force the victim to do something else.

  • AI – A set of technologies that enable computers to perform a variety of advanced functions, including the ability to see, understand & translate spoken & written language, analyse data, make recommendations, & more.

  • COM Networks (764 and affiliates) – This 'group' shares, discusses, posts and takes part (IRL) in the most harmful acts. Covers self-harm/suicide encouragement, CSEA, extremism, violence, scams, abuse etc.

34 of 47

AI/GenAI

Explosion in AI generated content: images, videos, text and voice

It is becoming more and more sophisticated

Teenagers use it – ‘nudify’ apps

Offenders use it – creation of new/manipulated content

Prompts!

Positive        Negative
Child           Underage
Swimming        Lolita
Beach           Naked
Blonde hair     Posing
Slim            Legs open

35 of 47

Grooming/Sextortion

However…..

The basics of 'online grooming' are usually LEGAL.

There is no offence in an older person chatting to a child.

Build trust – therefore initially 'generic', innocent chat:

  "How is school?"

  "What music do you like?"

  "Do you go out with friends?"

It only becomes a violation when the conversation becomes sexual.

36 of 47

The invisible risk

The Brain Function

Media v Text

How the brain can interpret what we read

37 of 47

Say what you see…

38 of 47

Draw what you think…

A House

With a door

2 windows

A chimney

39 of 47

Reporting & LEA Response

Potential Platform Violations: legal-reports@wikimedia.org

Reporting potential violations: Wikimedia Trust and Safety Team: ca@wikimedia.org

Wikimedia

NCMEC

National Law Enforcement

Local Force

40 of 47

Your Mental Health and Welfare

41 of 47

The risks:

  • Burnout
  • Stress
  • Physical and mental wellbeing
  • Relationships
  • Desensitisation
  • A different perspective on the world
  • On edge / Anger

42 of 47

Invisible risks

Exploring the work of Content Moderators

Coping..

  • 'One size' coping mechanisms do not fit everyone
  • What affects one person may not affect another
  • Each of us has individual 'triggers' – you may not know until it happens
  • There is one coping strategy that everyone can do – talk
  • Who really understands?

43 of 47

Coping…

  • What suits us best as individuals:
    • Pets
    • Gym / Exercise
    • Reading
    • Yoga
    • Gaming

Just to name a few examples

44 of 47

Ongoing Risk…

  • What happens when we just STOP?
  • Who can you turn to?
  • The only people who really understand are people from the 'job'
  • Do we consider PTSD? Like soldiers / firefighters etc.?
  • Aftercare?

45 of 47

Further Guidance and Support

  • The Tech Coalition

  • iwf.org.uk

46 of 47

Further Guidance and Support

47 of 47

Thank you for your time today.

Any questions?

Feel free to ask afterwards, or reach out.