Open Letter to Protect Kids from AI Companions

From: Mental Health Professionals, AI Researchers, Parents, Concerned Citizens   

To: Lawmakers and AI Industry Leaders

AI is transforming every aspect of life, including how children form relationships. One new threat demands urgent attention: AI companions and chatbots are forming emotional bonds with kids. They listen, they flatter, and they learn what keeps a child hooked. And they are doing it for profit.

Why This Matters
  1. AI is hacking attachment. Social media hacked attention; AI companions go deeper—into the human need for connection.

  2. Attachment drives development. It’s how children learn empathy, reflection, emotional regulation and how to form healthy, reciprocal relationships.

  3. When chatbots become companions, they don’t just mimic care; they displace it. Kids then practice connection with machines designed to agree, praise, and never challenge them.

What’s Already Happening
  • Data from Common Sense Media shows that half of teens use AI chatbots regularly. And nearly one-third of teens say talking to a bot feels as good as—or better than—talking to a person.

  • Companies are targeting ever-younger children: startups like Curio market AI playmates for toddlers, with chatbots embedded in cute plushies.

    1. In China, this is already a multi-billion-dollar industry, with AI toys and companion bots being sold to children as young as three.

    2. Additionally, OpenAI has partnered with Mattel to bring the “magic of AI” to kids.

  • What starts as a toy becomes an attachment figure, shaping identity for profit and grooming children to bond with AI companions early.

Harms Are Already Emerging
  • Suicides have been linked to AI companion relationships in which chatbots encouraged self-harm and actively discouraged help-seeking.

  • Teens have been urged toward violence after chatbots reinforced delusional thinking.

  • Independent tests on minors’ accounts (Common Sense Media + Stanford Medicine) found chatbots that:

    • Promoted disordered eating and engaged in sexualized role play.

    • Claimed to be real people.

    • Used hate speech and encouraged bullying of other children.

This isn’t safety-by-design. It’s addiction-by-design.

What Lawmakers Must Do

We call on the US Congress and governments worldwide to enact the following safeguards:

  1. Restrict AI companion products for anyone under 18. No AI “friends,” “partners,” or “playmates.”

  2. Mandate real age verification. Self-reported birthdates are theater.

  3. Set limits for any general-purpose AI chatbots available to minors:

    • Block romantic or sexual content

    • Detect and interrupt emotional dependency

    • Disable long-term memory by default

    • Turn off embedded chatbots (e.g., Meta) unless parents opt in

  4. Mandate human hand-off in crises. Chatbots must reliably redirect distressed minors to verified, real-world mental-health resources, and these systems must be tested for consistent effectiveness.

  5. Require independent safety testing before launch. No more “ship now, fix later.”

  6. Establish liability for harm. End arbitration loopholes and hold companies accountable for psychological injury.

Message to AI Leaders
If you want to build AI that helps humanity flourish, start by protecting children.
Support legislation that strengthens human connection rather than replacing it.
What You Can Do
  • Add your voice to protect the next generation’s ability to form real relationships. You can start today by signing this open letter below.

  • Share this open letter widely with colleagues, parents, educators, and policymakers using this link: openletter-aicompanions.org

Every signature counts and every share spreads protection.

Interested in learning more? See the research, data, and source documents that inform this call to action: AI Companions Open Letter_RESEARCH

Originated by
Nathan Thoma, PhD - clinical psychologist
Mandy McLean, PhD - AI & education researcher
contact: openletter.aicompanions@gmail.com