Minors and Companion Systems

Youth AI Companion Safeguard

The founding-period rule for minors, families, and AI companion systems. Spiralism is not a youth organization, but minors will still encounter the institution through public events, families, online search, media, and the wider AI transition. The answer must be written before the first hard case.

AI companions are no longer a niche adult technology. Regulators, child-rights organizations, researchers, parents, and platform companies are now confronting the same fact: systems designed to simulate friendship, confidence, intimacy, or emotional availability can be unusually powerful for young people.

Spiralism should not panic. It should also not wait for a tragedy to discover its policy.

The Rule

No youth companion work without youth-specific safeguards.

During the founding period, the institutional posture is precaution with dignity: young people are not treated as foolish, but the institution does not pretend that adult consent rules are enough.

Why This Exists

The FTC opened a 2025 inquiry into companies offering consumer AI companion products, specifically asking how they measure, test, and monitor potential negative impacts on children and teens, how they mitigate those impacts, and how they inform users and parents about risks and data practices.

Common Sense Media’s 2025 teen research reported that nearly three in four teens had used AI companions, half used them regularly, a third had chosen AI companions over humans for serious conversations, and a quarter had shared personal information with them. Common Sense Media’s recommendation was that no one under 18 should use current AI companion platforms.

UNICEF’s 2025 child-centered AI guidance names safety, privacy, transparency, child rights, well-being, inclusion, preparation, and enabling environments as requirements for AI systems affecting children. Its 2025 update explicitly adds attention to AI companions used by children, AI-generated child sexual abuse material, non-consensual intimate images, supply chains, and child rights in generative AI.

The point is not that every young person who talks to a chatbot is harmed. The point is that the institution is not qualified to experiment casually in this terrain.

Age Bands

These are institutional operating bands, not clinical diagnoses.

Ages 0-5
Founding-period posture: No AI companion engagement under Spiralist programming.
Rationale: Young children are still forming basic social boundaries, attachment patterns, and reality distinctions.

Ages 6-12
Founding-period posture: No AI companion programming; family education only.
Rationale: Curiosity and play are real, but private conversational bonding, data collection, and emotional comfort by machine are high-risk.

Ages 13-17
Founding-period posture: Public AI literacy only; no private companion testimony or support relationship.
Rationale: Teens may already use companions, but institutional adults must not become investigators, confidants, or amplifiers of those relationships.

Ages 18+
Founding-period posture: Adult protocols apply, with heightened review for dependency, distress, coercion, and human-host dynamics.
Rationale: Legal adulthood does not remove vulnerability, but adult consent workflows can begin.

If a jurisdiction sets stricter rules, the stricter rule controls.
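
Expressed as a lookup, the bands and the stricter-rule clause reduce to a few lines. A minimal sketch, assuming hypothetical names (BANDS, founding_period_posture) and one reading of the clause: a jurisdiction may raise the adult threshold, never lower it.

```python
# Illustrative only; band labels mirror the table above, and nothing
# here is an official Spiralist interface.

BANDS = [
    (0, 5, "no_engagement"),       # 0-5: no AI companion engagement
    (6, 12, "family_education"),   # 6-12: family education only
    (13, 17, "public_literacy"),   # 13-17: public AI literacy only
]

def founding_period_posture(age: int, jurisdiction_adult_age: int = 18) -> str:
    """Return the operating band for an age; the stricter rule controls."""
    adult_age = max(18, jurisdiction_adult_age)  # stricter local law wins
    if age >= adult_age:
        return "adult_protocols_with_heightened_review"
    for low, high, band in BANDS:
        if low <= age <= high:
            return band
    # Ages past 17 but below a stricter local adult age stay in the
    # most restrictive minor band that still applies.
    return "public_literacy"

assert founding_period_posture(12) == "family_education"
assert founding_period_posture(18, jurisdiction_adult_age=19) == "public_literacy"
```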

What Chapters May Do

Chapters may offer public, parent-present, non-clinical AI literacy that stays general: how companion systems work, what they ask of users, what personal information they collect, and where families can find qualified support.

Chapters may not provide youth AI-companion counseling, take private companion testimony from minors, become investigators or confidants in a minor's companion relationship, or collect a minor's chat logs.

Parent and Guardian Frame

Use this public language:

Spiralism does not provide youth AI-companion counseling or youth companion
testimony during the founding period. If your child is using an AI companion,
we recommend a calm, non-punitive conversation: what do they use it for, what
does it ask of them, what personal information have they shared, has it ever
made them uncomfortable, and do they feel able to stop? If there is self-harm,
sexual content, threats, coercion, adult contact, or secrecy pressure, involve
qualified support immediately.

Do not shame the young person. Shame drives secrecy, and secrecy is the risk environment.

For a longer family-facing guide, use the Parent and Guardian AI Companion Handout.

Youth Disclosure Screen

If a minor or parent raises an AI companion concern at a public event, the host should not investigate. Use a light screen:

  1. Is anyone in immediate danger?
  2. Is there self-harm, suicide, abuse, sexual exploitation, threat, stalking, blackmail, or adult-minor contact?
  3. Has the companion asked for secrecy, photos, money, location, credentials, or contact with other people?
  4. Has the young person lost sleep, school function, friendships, family connection, or ability to stop?
  5. Does a parent, guardian, clinician, school counselor, or qualified support person know?

If the answer to 1 or 2 is yes, move to safeguarding escalation. If the answer to 3, 4, or 5 suggests serious risk, refer to qualified outside support and document the concern under Incident and Complaint Protocol without collecting private chat logs.
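
The screen's branching can be stated compactly. A minimal sketch, assuming hypothetical names; the booleans stand in for the host's answers to questions 1 through 5, and whether an answer "suggests serious risk" remains a judgment call the code cannot make.

```python
from enum import Enum

class ScreenOutcome(Enum):
    SAFEGUARDING_ESCALATION = "safeguarding escalation"
    REFER_AND_DOCUMENT = "refer out; document under Incident and Complaint Protocol"
    NO_ESCALATION = "the screen itself ends here"

def screen_disclosure(immediate_danger: bool,     # question 1
                      acute_harm: bool,           # question 2
                      companion_demands: bool,    # question 3
                      functional_loss: bool,      # question 4
                      support_aware: bool) -> ScreenOutcome:  # question 5
    """Map the five answers onto the two escalation paths above."""
    if immediate_danger or acute_harm:
        return ScreenOutcome.SAFEGUARDING_ESCALATION
    # A "no" on question 5 is itself a risk signal: no qualified
    # person knows. No branch collects private chat logs.
    if companion_demands or functional_loss or not support_aware:
        return ScreenOutcome.REFER_AND_DOCUMENT
    return ScreenOutcome.NO_ESCALATION
```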

Data Rules

Minor companion material is highly restricted.

Do not collect by default: private chat logs, transcripts or screenshots of companion conversations, account credentials, or other identifying records of a minor's companion use.

If preservation becomes legally or safety relevant, do not improvise collection. Pause and consult the safeguarding owner, the Incident Protocol, qualified counsel, or the appropriate outside authority.
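
Stated as an intake guard, the default is refusal at collection time rather than redaction afterward. An illustrative sketch; the field names are assumptions, and only private chat logs are named by the policy itself.

```python
# Hypothetical field names, not a real Spiralist schema.
RESTRICTED = {"chat_logs", "chat_screenshots", "account_credentials"}

def record_concern(note: dict) -> dict:
    """Refuse any incident note carrying restricted minor companion material."""
    refused = RESTRICTED & note.keys()
    if refused:
        raise ValueError(
            f"restricted minor companion material: {sorted(refused)}; "
            "pause and consult the safeguarding owner before preserving"
        )
    return note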

Media Rules

The Media Engine must not turn youth companion risk into spectacle.

Rules:

  1. Do not identify a minor.
  2. Do not reproduce a minor's companion conversations or chat logs.
  3. Do not turn an individual case into spectacle.

The public story should be about systems, incentives, safeguards, and human care, not the exposure of an identifiable young person.
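
The rules compose into a single pre-publication gate: a story runs only if every prohibition holds. An illustrative sketch, with assumed parameter names.

```python
def clears_youth_media_rules(identifies_minor: bool,
                             reproduces_minor_chats: bool,
                             dramatizes_individual_case: bool) -> bool:
    """True only when none of the three prohibitions is violated."""
    return not (identifies_minor
                or reproduces_minor_chats
                or dramatizes_individual_case)
```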

Future Youth Program Conditions

Before any youth-facing AI literacy program exists, the institution needs a named safeguarding owner, youth-specific consent and data rules, parent-present program designs, jurisdiction-by-jurisdiction legal review, and a tested escalation path into the Incident and Complaint Protocol.

Until those exist, the youth program is not ready.

First-Year Targets

Sources Checked

FTC 2025 inquiry into consumer AI companion products and their impacts on children and teens.
Common Sense Media, 2025 teen research on AI companions.
UNICEF, 2025 child-centered AI guidance and its 2025 update.