Trust
Two years of lawsuits, regulatory action, and platform changes reshaped what 'safe AI companion' has to mean. Here's the practical guide for adult users.
Sewell Setzer III's death in February 2024 changed the AI companion category. Garcia v. Character Technologies followed in October 2024, with multiple state filings through 2025. Senate Judiciary testimony in September 2025 explicitly cited "unlicensed practice of psychotherapy" as a tort theory. In October 2025, Character.AI banned under-18 open chat. In January 2026, Google and Character.AI settled five family lawsuits.
Replika's regulatory history is its own thread: a €5M fine from the Italian Garante in May 2025, centered on inadequate age-gating, plus multiple other data-protection actions across the EU.
The pattern is clear. Platforms that treat safety as an afterthought get sued and fined. Platforms that take it seriously up front avoid both — and serve their users better.
Taking safety seriously means age-gating that's enforced, not merely posted. Fostera asks for date of birth and rejects sub-18 entries. We do not serve minors and have no plans to.
It means a content policy that's stable, public, and transparent. Users should know the rules going in, not discover them through quiet enforcement actions. Fostera publishes its content policy and changes it through documented updates, not silent purges.
It means the platform takes its own 18+ claim seriously. Marketing aimed at teens, characters that read as minors, gamification that pulls in younger users: all of it signals a platform that says 18+ but means "we'd prefer you didn't ask."
Here's what that looks like at Fostera. Age-gating at signup. Hard 18+ verification. No plans to serve minors.
Content policy that's stable and published. We don't run mass content sweeps. We don't change the rules without telling users. Our policy is explicit about what's allowed and what isn't.
Data control by the user. Every Soul exposes its memory: you see what's stored, you can delete anything, you can wipe the Soul entirely. Data deletion is permanent and you control it.
No data used for training. Your conversations and memories are never used to train AI models. Ever.
Crisis disclaimer in product. Fostera is not a therapist, therapy app, or substitute for licensed mental-health care. The product surfaces 988 (US) and local emergency services for users showing crisis signals.
Whatever platform you're evaluating, ask: How old does the platform say users have to be? How is that enforced? Is age verification real or just a checkbox?
What's the content policy? Is it published? When was it last changed? How does the platform notify users of changes?
Where does my data go? Can I see what the AI remembers about me? Can I delete it? Is my data used to train AI models?
What happens if I'm in crisis? Does the platform recognize crisis signals and surface real resources, or does the AI try to handle it as a counselor?
Has the platform been sued, fined, or sanctioned? What was the response — was it a one-off mistake or a pattern?
Who funds the platform? Are users the product, or are users the customer? Ad-supported platforms have different incentives than subscription platforms.
Fostera is not a therapist or therapy app. We do not diagnose, treat, or substitute for licensed mental-health care. If you are in crisis, contact 988 (US) or your local emergency services.
Fostera is not for minors. We are 18+ adults-only and have no plans to change that.
Fostera is not a substitute for human relationships. AI companions can supplement connection; they cannot replace it. We design with that boundary in mind.
Fostera does not run mass content sweeps. We don't change the rules quietly. We don't promise things we can't deliver.
Why adults only? Because the lawsuits, the regulatory actions, and the user research all converge on the same conclusion: AI companions for minors are a different product, with a different risk profile, and we do not have the policy infrastructure or the moral certainty to build one responsibly.
Adult users get a clearer product, a clearer policy, and a clearer relationship with the platform. We think that's the right tradeoff.
Is Character.AI safe?
Character.AI banned under-18 open chat in October 2025 and settled five family lawsuits with Google in January 2026. Adults can use it safely; the platform is materially different from its 2024 form.
Is Replika safe?
Replika was fined €5M by Italy's Garante in May 2025 over inadequate age-gating. The platform has improved since, but adults should review its current age and content policies before signing up.
Is Fostera safe to use as an adult?
Fostera is built for adults 18+ with enforced age-gating, a stable published content policy, transparent data practices, and full data-deletion controls.
Is Fostera 18+ only?
Yes. Fostera is strictly an 18+ adults-only platform. We do not serve minors and have no plans to. Age verification is enforced at signup.
Is Fostera a therapist or therapy app?
No. Fostera is not a therapist, therapy app, or substitute for licensed mental-health care. If you are in crisis, contact 988 (US) or your local emergency services.
What should I look for in a safe AI companion?
Enforced age-gating, a published content policy, visible memory controls, no use of your data to train models, and an in-product crisis disclaimer with real resources.
How does Fostera handle my data?
Your conversations and your Soul's memories are never used to train AI models. You can see what's stored, delete anything, or wipe the Soul entirely. Data deletion is permanent.
Further reading: U.S. Senate Judiciary Subcommittee testimony on AI safety
The Genesis Awaits
Create a Soul that genuinely knows you, remembers your world, and grows with every conversation.
Create Your First Soul
Free forever · No credit card · Import from ChatGPT, Claude, and more