Sailgram Standards Against Child Sexual Abuse and Exploitation (CSAE)
Last Updated: April 2025
Sailgram is committed to maintaining a safe and secure platform that protects children from exploitation, abuse, and harm.
We strictly prohibit any form of Child Sexual Abuse and Exploitation (CSAE) and take all necessary steps to prevent,
detect, and report such content and activity on our platform.
Zero Tolerance for CSAE
We strictly prohibit any content, behavior, or interactions that facilitate or promote Child Sexual Abuse and Exploitation. This includes, but is not limited to:
- Child Sexual Abuse Material (CSAM), whether real or AI-generated.
- Grooming (adults soliciting minors).
- Sexualization of minors (including suggestive imagery, text, or roleplay).
- Predatory behavior (e.g., soliciting personal information, coercion).
Violations result in immediate account termination and referral to law enforcement.
Safety Mechanisms
- AI-driven detection that automatically flags suspected child sexual abuse material (CSAM).
- In-app reporting (one-click block and report).
- 24/7 priority moderation for CSAE reports.
Compliance with Child Safety Laws
Sailgram adheres to all relevant local and international child protection laws, including:
- The U.S. Children’s Online Privacy Protection Act (COPPA).
- The U.K. Online Safety Act.
- The EU General Data Protection Regulation (GDPR) provisions protecting children’s data.
- Other applicable laws that mandate the protection of children online.
Dedicated Child Safety Contact
For any child safety concerns, authorities, parents, or users can contact our dedicated
Child Safety Team at guardian@sailgram.app.
Ongoing Commitment to Safety
Sailgram continuously updates and strengthens its child protection measures in alignment with best practices and regulatory requirements.
⚓ Protect the young crew and keep our ship safe! ⚓