Effective 2026-05-09 · v1.0
Content Moderation Policy
This policy describes how we moderate user content and handle reports. It complements the Acceptable Use Policy and aligns with EU Digital Services Act (DSA) obligations where applicable.
1. What we moderate
- Public flashcard decks (/market) and presentations (/present/[token]).
- Profile bios, avatars, and nicknames.
- Messages and screen-shared content in study rooms.
- Reports submitted by users via the report button or by email.
Private content (personal notes, decks marked private, AI conversations) is not proactively scanned. We may access it only on a verified user report, a lawful authority request, or a clear safety threat (e.g. CSAM detection through automated hashing — required by law).
2. How reports are handled
- Report submitted via in-app button or moderation@getunimate.com.
- Confirmation email within 1 hour.
- First human review within 24 hours; clear-cut violations are actioned within 4 hours.
- Action taken: no action, content removed, account warned, account suspended, account banned.
- Both the reporter and the reported user are informed of the outcome, with reasoning.
3. Tooling
- Automated CSAM detection (PhotoDNA / hash-list) — applied to all uploaded images. Hits go directly to NCMEC and equivalent authorities.
- Spam detection on public decks (rate of similar content, suspicious link patterns).
- No automated content-policy enforcement on text — every text-based action is human-reviewed.
4. Appeals
If we removed your content or restricted your account and you think we got it wrong, reply to the action email or write to moderation@getunimate.com within 30 days with "Appeal" in the subject. A team member other than the original reviewer will handle it; appeals are decided within 7 days.
For decisions affecting EU residents, you may also use the EU's out-of-court dispute settlement procedure described under DSA Art. 21.
5. Legal-authority requests
We comply with valid legal process from Switzerland, the EU, and Kazakhstan, scoped to the specific data described in the order. We notify the affected user before disclosure unless prohibited by law. Annual statistics on requests received and complied with are published in the trust center.
6. Trusted-flagger contact
Recognised trusted flaggers (DSA Art. 22) and equivalent KZ/CH child-safety bodies should email moderation@getunimate.com; these reports are queued ahead of regular user reports.
7. Transparency
Every quarter we publish on the trust center:
- number of reports received
- average response time
- percentage of reports leading to action
- breakdown by category
- appeals statistics