👶 Age Requirements & Verification
Emhash is designed for users aged 13 and older. We implement strict age verification measures to protect children.
- Minimum Age: Users must be at least 13 years old to create an account (see the sketch after this list)
- Age Verification: Date of birth is required during registration and verified through authentication providers
- Parental Guidance: Users under 18 are encouraged to use the app with parental guidance and supervision
- Account Termination: Accounts found to belong to users under 13 are immediately terminated
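For illustration, a minimal sketch of the minimum-age check (assuming the registration flow receives the date of birth as a calendar date; the function and constant names are illustrative, not our production code):

```python
from datetime import date

MINIMUM_AGE_YEARS = 13  # policy minimum described above

def is_old_enough(date_of_birth: date, today: date | None = None) -> bool:
    """Return True if the user is at least MINIMUM_AGE_YEARS old."""
    today = today or date.today()
    # Count whole years, subtracting one if the birthday has not yet
    # occurred in the current calendar year.
    had_birthday = (today.month, today.day) >= (date_of_birth.month, date_of_birth.day)
    age = today.year - date_of_birth.year - (0 if had_birthday else 1)
    return age >= MINIMUM_AGE_YEARS

# A registration handler would reject the sign-up when this returns False.
assert is_old_enough(date(2000, 1, 1), today=date(2025, 6, 1))
assert not is_old_enough(date(2015, 6, 2), today=date(2025, 6, 1))
```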
🛡️ Prevention of Child Sexual Abuse & Exploitation (CSAE/CSAM)
We have zero tolerance for child sexual abuse material (CSAM) and child sexual abuse and exploitation (CSAE). Our platform employs multiple layers of protection.
⚠️ Zero Tolerance Policy
Any content, behavior, or communication involving child sexual abuse or exploitation results in immediate account termination, content removal, and reporting to the National Center for Missing & Exploited Children (NCMEC) and law enforcement authorities.
- Proactive Detection: Automated systems scan all uploaded content for potential CSAM using industry-standard detection technologies
- Hash Matching: We utilize PhotoDNA and other hash-matching technologies to identify known CSAM (see the sketch after this list)
- Human Review: Flagged content is reviewed by trained moderation teams within 24 hours
- Immediate Reporting: Confirmed CSAM is reported to NCMEC within 24 hours as required by law
- User Blocking: Users who attempt to share CSAM are permanently banned and reported to authorities
- Metadata Preservation: All evidence is preserved and provided to law enforcement upon request
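For illustration, a simplified sketch of the hash-matching step (PhotoDNA and similar perceptual-hash systems are proprietary and not reproduced here; this uses a plain cryptographic hash and illustrative names to show the flow only):

```python
import hashlib

# Illustrative placeholder for digests supplied by an industry
# hash-sharing program; real systems use perceptual hashes, not SHA-256.
KNOWN_CSAM_HASHES: set[str] = set()

def screen_upload(content: bytes) -> str:
    """Return a moderation decision for an uploaded file."""
    digest = hashlib.sha256(content).hexdigest()
    if digest in KNOWN_CSAM_HASHES:
        # Matched content is blocked, preserved as evidence, and queued
        # for the NCMEC reporting workflow described above.
        return "block_and_report"
    # Everything else still passes through automated classifiers and,
    # if flagged, the 24-hour human review described above.
    return "continue_moderation"
```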
🔍 Content Moderation & Safety Measures
Our multi-layered content moderation system combines automated technology with human oversight to maintain a safe environment.
- Automated Filtering: AI-powered systems detect and block inappropriate content, including explicit material, violence, and harmful language
- Community Guidelines: Clear, comprehensive guidelines prohibit content that endangers minors or violates safety standards
- User Reporting: Easy-to-use reporting tools allow users to flag concerning content or behavior
- 24/7 Monitoring: A dedicated moderation team reviews reported content around the clock
- Privacy Controls: Users can control who sees their content and who can interact with them
- Location Privacy: Precise location data is never shared publicly; only general area indicators are shown (see the sketch after this list)
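For illustration, a minimal sketch of how precise coordinates can be coarsened before anything is displayed publicly (the one-decimal precision, roughly 10 km of latitude, and the function name are illustrative assumptions):

```python
def coarse_location(lat: float, lon: float, decimals: int = 1) -> tuple[float, float]:
    """Round coordinates so only a general area can be derived
    from publicly visible data."""
    return round(lat, decimals), round(lon, decimals)

# A precise position becomes a city-scale indicator before display.
print(coarse_location(40.748817, -73.985428))  # (40.7, -74.0)
```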
🚨 Reporting Mechanisms
Every post, profile, and interaction includes a "Report" button. Reports are prioritized based on severity, with child safety concerns receiving immediate attention. Users receive confirmation when their report is received and updates on actions taken when appropriate.
📋 Report Handling & Response
We take all reports seriously and follow a structured process to investigate and respond appropriately.
- Immediate Triage: Child safety reports are flagged as critical and reviewed within 1 hour (see the sketch after this list)
- Investigation: Trained safety specialists investigate all aspects of reported content or behavior
- Swift Action: Violating content is removed immediately; accounts are suspended pending investigation
- User Communication: Reporters receive acknowledgment and, when appropriate, outcome notifications
- Appeals Process: Users can appeal moderation decisions through a fair review process
- Continuous Improvement: Report patterns are analyzed to improve detection and prevention systems
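For illustration, a minimal sketch of severity-based triage (only the 1-hour child-safety review target comes from the policy above; the other categories, windows, and names are illustrative assumptions):

```python
from datetime import datetime, timedelta, timezone

REVIEW_SLA = {
    "child_safety": timedelta(hours=1),   # critical: 1-hour review target
    "violence_or_threats": timedelta(hours=4),
    "other": timedelta(hours=24),
}

def triage(category: str, received_at: datetime | None = None) -> dict:
    """Assign a priority flag and review deadline to an incoming report."""
    received_at = received_at or datetime.now(timezone.utc)
    sla = REVIEW_SLA.get(category, REVIEW_SLA["other"])
    return {
        "category": category,
        "critical": category == "child_safety",
        "review_by": received_at + sla,
    }
```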
👮 Cooperation with Authorities
We work closely with law enforcement agencies and child safety organizations to protect minors and support the prosecution of offenders.
- NCMEC Reporting: All CSAM incidents are reported to the National Center for Missing & Exploited Children (NCMEC) as required by federal law
- Law Enforcement Requests: We respond promptly to valid legal requests from law enforcement agencies worldwide
- Evidence Preservation: User data and content related to investigations are preserved and provided to authorities
- Proactive Cooperation: We proactively share information about potential threats to child safety with appropriate authorities
- Industry Collaboration: We participate in industry coalitions and share best practices for child safety
- Transparency Reports: We publish annual transparency reports detailing our safety efforts and cooperation with authorities
🤝 Our Partnerships
Emhash partners with organizations including the National Center for Missing & Exploited Children (NCMEC), the Internet Watch Foundation (IWF), and the Technology Coalition to combat child exploitation and continuously improve our safety measures.
🔒 Built-in Safety Features
Our platform includes multiple safety features designed to protect users, especially minors.
- Private by Default: New accounts start with restrictive privacy settings enabled (see the sketch after this list)
- Block & Mute: Users can block or mute other users to prevent unwanted interactions
- Message Filtering: Automated systems filter messages for inappropriate content before delivery
- Location Controls: Users control whether and how location information is shared
- Safety Resources: In-app resources provide guidance on staying safe online
- Crisis Support: Direct links to crisis helplines and mental health resources are readily available
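For illustration, a minimal sketch of private-by-default account creation (the field names and values are illustrative, not our actual settings schema):

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Defaults applied to every newly created account."""
    profile_visibility: str = "private"     # only approved followers see posts
    allow_messages_from: str = "followers"  # no unsolicited messages from strangers
    share_precise_location: bool = False    # only coarse area indicators
    discoverable_in_search: bool = False    # discovery is opt-in

def new_account_settings() -> PrivacySettings:
    # Users can relax these later; the starting point is private.
    return PrivacySettings()
```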
📚 Education & Resources
We believe education is key to creating a safer online environment for everyone.
- Safety Center: Comprehensive safety resources and guidelines available in-app and on our website
- Parent Guides: Resources to help parents understand the platform and guide their children's use
- Digital Literacy: Educational content about online safety, privacy, and responsible digital citizenship
- Community Guidelines: Clear, accessible guidelines explaining acceptable behavior and content
- Regular Updates: Users receive notifications about new safety features and best practices
🔄 Continuous Improvement
Child safety is an ongoing commitment. We continuously evaluate and enhance our safety measures.
- Regular Audits: Third-party security and safety audits are conducted annually
- Technology Updates: We invest in the latest detection and prevention technologies
- Team Training: Moderation teams receive ongoing training on child safety and trauma-informed practices
- User Feedback: We actively solicit and incorporate user feedback on safety features
- Policy Reviews: Safety policies are reviewed and updated quarterly to address emerging threats
- Research Participation: We contribute to and participate in child safety research initiatives