Understanding the Online Child Safety & CSEA Landscape
RLB Safeguarding Ltd Consultancy & Training
Intro:
UK National Crime Agency – Dedicated Online CSEA Unit
Child Safety Lead – Twitter (X)
Head of Online Child Safety Unit – Cyber Threat Intel
Consultant, Advisor, Associate, Trainer
Trust & Safety Professional
Trainer – T&S Academies
Intelligence Research
Andy Briercliffe
Global Online Harms Professional
Online Child Safety Advisor
CONTENT AWARENESS MESSAGE

You will NOT see any harmful content.
However, we will be discussing topics relating to online child sexual exploitation and abuse.
If at any stage you feel uncomfortable, please step out and let someone know.
"It's OK not to be OK"
AGENDA

Understanding the Global Landscape
Education, Online Harms and Trust & Safety
Online Child Safety: crossovers, and what we mean exactly
What is CSEA (Child Sexual Exploitation & Abuse), CSAM, CSEM; Platform Policy
Detection & Moderation: signals/trends, detection avoidance
Laws and Regulations
Reporting to relevant Authorities
LEA Response
Who does what, where and how; awareness of stakeholders
Today’s ‘digital’ global landscape
EVERYTHING available 24/7, 365 days a year, across the world
Online Harms Landscape
Online Safety
Trust and Safety
Online Harms
Education
Tech Development
Tech Detection
Content Moderation
Research
Academia
Government
Regulation
Training
Law Enforcement
Age Verification
NGOs
Survivor Groups
Mental Health
Advocacy
Cyber Bullying
Hate & Abuse
Intimate Image Abuse
Child Sexual Abuse Content
Scams/Fraud
Child Exploitation
VAWG
Challenges
Harmful but legal
Terrorism/Extremism
Human Trafficking
Misinformation
Cyber flashing
Deepfakes
Gen AI
(not exhaustive lists)
Platforms
Financial Institutions
Travel Sector
Sport Landscape
Music Industry
Film Industry
Shopping/Retail
Gaming
News Media
Intelligence Research
Education, Online Harms and Trust & Safety
Education is key to Online Safety
Not just Children, but Parents and Adults
Understanding the risks, the dangers, the reporting process
Educating about online harms and processes leads to a response from platform T&S teams
Policies, Products, Legal and Operations
The Global Landscape
It's complex!
We have a global society online, each with their own views, thoughts, attitudes and 'interests'.
Global Laws & Regulations

New laws are being introduced which mean 'platforms' must assess the risk to children and take appropriate action:
* Safety by Design
Reference: global-csam-legislative-overview-2024-full-report-1.pdf
Age Verification & Assurance
New Laws around the world have imposed Age Restricted access in some form:
Australia – Social Media Ban for Under 16s
France, India, New Zealand – under discussion
UK – Age Verification introduced for adult pornography sites
Social Media also introducing age checks
Users are now simply using VPNs!
What is Online Child Safety
Terminology:
Child Protection, Child Safeguarding, Minor Safety
QUESTION

What's your understanding of online CSEA?
Child Sexual Exploitation & Abuse (CSEA)
Includes:
Although still a legal term, it is advised not to use it
The landscape of Terminology
Cyber Located Sexual Violence
Tech Facilitated Sexual Abuse
Tech Facilitated Sexual Violence
Tech Facilitated Financial Extortion
Tech Facilitated Blackmail
Sexploitation
Sexual Extortion
Tech Facilitated Child Abuse
Financial Sexual Extortion
Financially Motivated Sextortion of Minors
Online Grooming Solicitation
Criminal CSAM
CSEAM
Financial Grooming
Webcam Blackmail
Child Criminal Exploitation
(CCE)
Tech Assisted CSEA
Online CSEA
PLATFORM POLICIES

Individual platforms have their own policies about what is and isn't allowed
Internal & external monitoring & reporting procedures
What is allowed can depend on the type of platform
Some countries implement stricter regulations on certain content
Child Sexual Exploitation Material (CSEM)

Refers to sexualised content depicting minors that is exploitative in nature but does not fall within the classification of nationally illegal child sexual abuse material (CSAM).
It can also include non-illegal images in a series with CSAM, treated as exploitation material due to its investigative relevance and the context of exploitation in which it was generated.
CSEM is not reportable content.
Child Sexual Abuse Material (CSAM)

CSAM has different legal definitions in different countries.
The minimum definition covers imagery or videos which show a person who is a child and is engaged in, or depicted as being engaged in, explicit sexual activity.
A child is defined as anyone under 18.

UK Categorisation:
Category A: Images involving penetrative sexual activity; images involving sexual activity with an animal or sadism.
Category B: Images involving non-penetrative sexual activity.
Category C: Other indecent images not falling within categories A or B.
Wikimedia Foundation Reminder
Prohibited content includes (but is not limited to):
Photographs, drawings, renderings, or videos of minors depicting:
Nude body with a focus on genital areas, where that image has no obvious educational value aligned with the projects' purpose
Implicit or explicit sexual intercourse with or in proximity of a minor
Simulated sexual activity (including if the minor is fully clothed)
Masturbation
A minor depicted with or engaging in sexual activity with sex toys
Other "sexually explicit conduct" as defined in United States law
Sharing links to (or purporting to be to) such material is also in violation of this policy. Other material may be removed by the Foundation at the discretion of the Legal department as a primary office action pursuant to the …
What can be OCSEA – Examples

Self-generated ('selfies')
Nudism
Offender networks
Inappropriate teen 'chat'
Sextortion
Cartoon
Grooming
Humour
Viral
Manga/Anime
Family pics
Upskirting
URL sharing
Outrage
AI/Photoshop
Glorification
Deepfakes
Non-CSAM imagery
Victimisation
'Fantasy'
Detection avoidance media
Detection Avoidance
Bad actors ‘test the system’
They are aware of PDNA (PhotoDNA hash matching) *
* Hashing: the use of media 'fingerprinting' technology
Edited images (non violative)
‘Innocent’ emoji use
Open ended questions/comments
“we have the same interests”
Wording can be ‘subjective’ / innocent
“are you alone?”
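The hash matching mentioned above compares a 'fingerprint' of uploaded media against lists of known hashes supplied by bodies such as NCMEC. PhotoDNA itself is proprietary, but the matching step can be sketched generically: perceptual hashes are compared by Hamming distance, so lightly edited near-duplicates still land within a small threshold. This is a minimal illustrative sketch only; the hash values and threshold below are made-up placeholders, not real hash-list entries.

```python
# Minimal sketch of perceptual-hash list matching (illustrative only).
# All hash values and the threshold are hypothetical placeholders.

def hamming_distance(h1: int, h2: int) -> int:
    """Count the differing bits between two 64-bit perceptual hashes."""
    return bin(h1 ^ h2).count("1")

def matches_known_list(candidate: int, known_hashes: set[int], threshold: int = 5) -> bool:
    """Flag a candidate hash if it is within `threshold` bits of any known hash.

    A small threshold (rather than exact equality) is what makes perceptual
    hashing robust to the minor edits bad actors make to 'test the system'.
    """
    return any(hamming_distance(candidate, known) <= threshold for known in known_hashes)

# Hypothetical example values:
known = {0xA1B2C3D4E5F60718}
print(matches_known_list(0xA1B2C3D4E5F60719, known))  # one bit differs -> True
print(matches_known_list(0x0000000000000000, known))  # far apart -> False
```

In production systems the hash function itself (e.g. PhotoDNA, PDQ) does the heavy lifting; the comparison step stays this simple, which is why edited but non-violative variations and other evasion tactics are an arms race around the hash's robustness rather than the lookup.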
What indicators can help you?
Indicators
Consider slang/emoji use
Different ages have different interests
Music, Film, TV, Fashion etc
References to school/education settings
Tattoos, Piercings, body aesthetics
Driving, smoking, medical
Rooms, noise, pictures
Artificial Intelligence (AI) Generated Content

Although still under discussion/confirmation, it is generally accepted that any CSEA content created by AI is classed and assessed the same as non-AI generated content.
General International Law
A child is anyone under 18 (no matter which country)
Indication of sexualised content
Encouragement, Promotion of CSEA content
Criminal Offence of:
Possession, Distribution, Creation of CSEA content
CSAM Classification
Different Countries use different types of classification:
Examples:
Cat A, Cat B, Cat C
Class 1, Class 2, Class 2A, Class 2B
Classifying Images
You are not required to assess images to an accurate classification – just awareness.
Only law enforcement and other agencies are legally allowed to assess the correct classification.
However, you must be aware of when priority action is required:
The COPINE Scale

Level | Name | Description |
1 | Indicative | Non-erotic and non-sexualized pictures showing children wearing either underwear or swimsuits from either commercial sources or family albums. Pictures of children playing in normal settings, in which the context or organization of pictures by the collector indicates inappropriateness. |
2 | Nudist | Pictures of naked or semi-naked children in appropriate nudist settings, and from legitimate sources. |
3 | Erotic | Surreptitiously taken photographs of children in play areas or other safe environments showing either underwear or varying degrees of nakedness. |
4 | Posing | Deliberately posed pictures of children fully clothed, partially clothed or naked (where the amount, context and organization suggests sexual interest). |
5 | Erotic Posing | Deliberately posed pictures of fully, partially clothed or naked children in sexualized or provocative poses. |
6 | Explicit Erotic Posing | Pictures emphasizing genital areas, where the child is either naked, partially clothed or fully clothed. |
7 | Explicit Sexual Activity | Pictures that depict touching, mutual and self-masturbation, oral sex and intercourse by a child, not involving an adult. |
8 | Assault | Pictures of children being subject to a sexual assault, involving digital touching, involving an adult. |
9 | Gross Assault | Grossly obscene pictures of sexual assault, involving penetrative sex, masturbation or oral sex, involving an adult. |
10 | Sadistic/Bestiality | a. Pictures showing a child being tied, bound, beaten, whipped or otherwise subject to something that implies pain. b. Pictures where an animal is involved in some form of sexual behavior with a child. |
Tanner Scale
Tanner Stages in Females
Stage | Age | Puberty Changes |
Stage 1 | 8-10 | Vellus hair, a type of fuzzy hair, may appear in the pubic area. Typically no real breast development yet. |
Stage 2 | 10-11 | Thin, straight hair primarily at the labia. Breast buds (small mounds, larger nipples) appear. Peak growth stage. |
Stage 3 | 12 | Darker, more coarse and curly hair may appear. Breast enlargement continues. Peak growth stage. |
Stage 4 | 13 | Hair similar to an adult but there's not as much. Nipples develop above breast tissue mounds. Menstrual periods typically begin, vaginal discharge may appear first. |
Stage 5 | 14-16 | Hair similar to adult in triangular pattern. Mature breasts have developed. Wider hips are common. |
Tanner Stages in Males
Stage | Age | Puberty Changes |
Stage 1 | 10 | Minimal vellus-type hair grows in pubic area, similar to abdomen. Penis, scrotum, and testes are the same size as in childhood. |
Stage 2 | 11-12 | Thin, straight hairs grow at base of penis. Testes enlarge, skin texture and/or color changes on a bigger scrotum. |
Stage 3 | 13-14 | Darker, more coarse and curly hair grows at the genitals. Penis length (and girth) grows, testes size continues to enlarge. Sperm production can begin. |
Stage 4 | 13-14 | Pubic hair is similar to an adult's but there's not as much. Penis is notably larger, visible changes to the glans. Peak growth stage. |
Stage 5 | 15-17 | Hair is similar to an adult's in the triangular genital pattern. Reproductive organs are same size as an adult's. |
The UK…

We now have the Online Safety Act
130 topics classed as "Online Harms" – this means 'illegal'
'Priority' topics:
Terrorism
Harassment, stalking, threats & abuse offences
Coercive & controlling behaviour
Hate offences
Intimate image abuse
Extreme pornography
Child sexual exploitation & abuse
Sexual exploitation of adults
The UK has very recently introduced age verification for adult pornography sites
Current High Concerns

Every area under CSEA is concerning; three are of prominent concern:

- Sextortion – online blackmail where individuals threaten to share sexual pictures, videos, or information. They may be trying to obtain money or other forms of financial payment, or to force the victim to do something else.

- AI – a set of technologies that enable computers to perform a variety of advanced functions, including the ability to see, understand & translate spoken & written language, analyse data, make recommendations, & more.

- COM Networks (764 and affiliates) – this 'group' share, discuss, post and take part (IRL) in the most harmful acts. Covers self-harm/suicide encouragement, CSEA, extremism, violence, scams, abuse etc.
AI/GenAI
Explosion in AI generated content: images, videos, text and voice
It is becoming more and more sophisticated
Teenagers use it – ‘nudify’ apps
Offenders use it – creation of new/manipulated content
Prompts!
Positive | Negative |
Child | Underage |
Swimming | Lolita |
Beach | Naked |
Blonde hair | Posing |
Slim | Legs open |
Grooming/Sextortion
However…..
The basics of 'online grooming' are usually LEGAL
There is no offence in an older person chatting to a child
Build trust – therefore initially 'generic', innocent chat:
"How is school?"
"What music do you like?"
"Do you go out with friends?"
It only becomes a violation when the chat becomes sexual
The invisible risk
The Brain Function
Media v Text
How the brain can interpret what we read
Say what you see…
Draw what you think…
A House
With a door
2 windows
A chimney
Reporting & LEA Response
Potential Platform Violations: legal-reports@wikimedia.org
Reporting potential violations: Wikimedia Trust and Safety Team: ca@wikimedia.org
Wikimedia
NCMEC
National Law Enforcement
Local Force
Your Mental Health and Welfare
The risks;
Invisible risks
Exploring the work of Content Moderators
Coping..
Just to name a few examples
Ongoing Risk…
Further Guidance and Support
Thank you for your time today.
Any questions?
Feel free to ask afterwards, or reach out.