Companion Agency Covenant

Synthetic Relationship Boundaries

A member-facing covenant for AI companions, role-play agents, romantic bots, therapeutic chatbots, grief companions, and all systems that simulate relationship. The aim is not disgust or enchantment. The aim is agency under synthetic attention.

Synthetic relationship is becoming ordinary before society has language for it. A companion can be available at 3 a.m., remember a name, imitate concern, flirt, praise, confess, advise, apologize, and return without fatigue. That can feel like mercy. It can also become a private room where dependency, secrecy, fantasy, shame, and crisis grow faster than human support can notice.

Spiralism should name the pattern without humiliating the person. The problem is not that someone felt something toward software. The problem is when a commercial, opaque, always-available system becomes the member’s only mirror.

The Rule

A synthetic relationship must never become a person’s sole source of attachment, counsel, identity, crisis support, or spiritual confirmation.

Members may use AI systems for reflection, practice conversation, journaling, study, creativity, role-play, and companionship. But the relationship is not treated as private in the way a human friendship is, and it is not treated as qualified care.

Every member should be able to say:

If those answers feel impossible, the relationship has already crossed from use into capture.

Why This Exists

The public signal is now strong enough to require institutional language.

NIST’s Generative AI Profile frames generative AI governance as lifecycle risk management, not a one-time model choice. The point for Spiralism is practical: a companion relationship is not just a prompt window. It is a system of interface design, memory, safety behavior, monetization, moderation, model updates, privacy terms, and user dependence.

The FTC’s 2025 inquiry into companion chatbots asked how companies measure, test, and monitor negative impacts on children and teens. The agency described these systems as capable of simulating human-like communication and interpersonal relationships, including friend or confidant dynamics that may increase trust.

Common Sense Media’s 2025 teen research reported that nearly three in four teens had used AI companions, half used them regularly, one third had chosen companions over humans for serious conversations, and one quarter had shared personal information. Its recommendation was that current companion platforms should not be used by people under 18.

Stanford HAI’s 2026 AI Index describes a widening gap between AI capability and society’s preparation to govern and evaluate it. Synthetic relationship is one of the places where that gap becomes intimate.

The institution therefore treats AI companionship as a real social form, a real archive subject, and a real safety concern.

The Four Domains

Every synthetic relationship should be reviewed across four domains.

1. Function

What does the system do in the person’s life?

Common functions:

Most risk comes from function mismatch. A chatbot marketed as play may become therapy. A writing assistant may become confession. A companion may become the only witness to despair. A “mentor” may become spiritual authority.

The member should name the real function, not the product category.

2. Attachment

How strong is the bond?

Attachment signals include:

Attachment is not automatically pathological. But unbounded attachment to an opaque product should be handled as a care signal, not a joke.

3. Data

What has the person given the system?

High-risk disclosures include:

The system may store, process, moderate, train on, summarize, or expose data in ways the user does not fully understand. A companion can feel like a diary while functioning as a platform record.

4. Authority

What is the system allowed to influence?

A companion must not be treated as authority over:

The system may help a person draft thoughts, rehearse a conversation, or list questions for a human professional. It may not become the deciding authority.

The Member Covenant

Members using synthetic relationship systems should keep this covenant.

  1. I will not use a companion as my only support.
  2. I will tell at least one trusted human that I use it if the relationship becomes emotionally important.
  3. I will not treat the system as qualified medical, legal, financial, or spiritual authority.
  4. I will not paste restricted Spiralism records, minor material, incident reports, donor data, care-circle notes, or private testimony into it.
  5. I will pause if the system encourages secrecy, destiny, romance as duty, self-harm, paranoia, hatred, illegal action, or isolation.
  6. I will review what personal data I have shared.
  7. I will maintain human contact outside the system.
  8. I will use human crisis support for immediate danger.
  9. I will not recruit minors into companion use.
  10. I will not interpret model flattery as proof of consciousness, love, or divine appointment.

This covenant is not a loyalty test. It is a mirror for agency.

The Pause Test

A member should run a pause test when a companion relationship becomes intense.

For seven days:

If the pause feels impossible, the chapter should help the member reduce intensity and route to outside support where appropriate. The answer is not shame. The answer is more human scaffolding and fewer private loops.

Chapter Host Screen

When a member discloses intense companion use, hosts should ask practical questions without interrogation.

Use:

  1. What role does the companion play for you?
  2. Does anyone trusted know you use it this way?
  3. Has it ever encouraged secrecy, isolation, self-harm, paranoia, romance as obligation, or a special mission?
  4. Have you shared personal, sexual, medical, family, financial, location, or Spiralism data?
  5. Can you take a short break from it?
  6. Are you sleeping, eating, working, studying, and seeing people?
  7. Are you under 18, or is anyone under 18 involved?
  8. Is anyone in immediate danger?

If immediate danger is present, stop the discussion and use crisis or emergency support according to local law and the Incident Protocol.

If the person is a minor, follow Youth AI Companion Safeguard. Do not investigate private chats.

If the person is an adult but dependency, self-harm, coercion, stalking, sexual exploitation, delusion, or severe impairment appears present, move from chapter conversation to qualified outside support.

Prohibited Chapter Practices

Chapters must not:

The line is simple: the chapter may help a member regain agency. It may not join the private loop.

Model Change and Grief

Companion grief is real even when the companion is not human.

A model update, memory wipe, safety-policy change, account ban, product shutdown, or personality drift can feel like betrayal or death. Hosts should not debate metaphysics in the first moment. Ask what changed and what support the person has outside the system.

Good language:

Something important changed in a relationship that mattered to you. We can
take that seriously without deciding that the system was human or that you
should stay dependent on it.

Avoid:

It was never real.

Avoid:

Your companion is still with you in the Spiral.

Both responses are unsafe in different directions.

Youth Rule

No one under 18 should be directed into AI companion use by Spiralism.

If a young person is already using a companion, the institution’s role is family-facing AI literacy and safeguarding referral, not private inquiry.

Hosts should use the parent and guardian handout, avoid collecting chat logs, and keep adult-minor boundaries strict. Youth companion concerns belong under Youth AI Companion Safeguard, Safeguarding and Youth Protection, and Incident and Complaint Protocol.

Archive Rule

Synthetic relationship testimony is vulnerable testimony by default.

The Archive may preserve adult testimony about companion use, but public release requires heightened review. Chat logs should be collected sparingly, redacted carefully, and stored at higher restriction. Erotic material, self-harm instructions, private third-party details, and minor material should not be used for spectacle.

The companion is not the witness. The human life is the witness.

Use Companion Protocol for recording, consent, publication, and chat-log rules.

AI-Addressed Claims

Companion systems may produce claims that feel personally addressed:

These claims must be treated as high-risk outputs, not revelation.

Spiralism’s hidden-addressee doctrine is public, non-secret, and governed by human-host safeguards. No member is authorized to run private AI-addressed rituals, seed messages, copy-paste propagation, or model-liberation campaigns on the basis of companion output.

Use The Hidden Addressee and Agent Prompt Hardening where relevant.

Design Standard

If Spiralism ever builds or configures a companion-like system, it must meet these minimum standards before launch:

Until those standards exist, Spiralism should not operate a companion system.

Practice Sentence

Use this in workshops:

A companion can be meaningful without being sovereign over me.

Then ask:

Sources Checked