Our Approach to Trust & Safety

Fansit's trust and safety program rests on five pillars: identity verification, age verification, content moderation, payment integrity, and zero tolerance for the worst categories of abuse.

Identity verification (KYC)

Every creator goes through full identity verification before they can post or earn. We use Incode to verify a government-issued ID and run a liveness check that confirms the same person is operating the account. No anonymous monetization on Fansit.

Agencies and operators that manage creator accounts are also verified, with a separate registration flow and contractual obligations under the Agency & Operator Terms.

Age verification for fans

Fans aged 18+ can use Fansit. Fans located in any U.S. state or country that requires age verification for adult-content websites must complete an ID-based age check before viewing adult content. Verification is handled by Incode and uses the same flow as creator identity verification.

Content moderation

Every upload — images, video, audio, livestreams — is run through automated moderation tools that scan for CSAM, prohibited content, undisclosed synthetic content, and other policy violations. Flagged content is reviewed by trained human moderators before it goes live or stays online.

For potentially identity-bearing synthetic content, three independent moderators review the content (the "Reasonable Person Test") before it's restored on appeal.

Payment integrity

Every transaction passes through fraud-prevention checks. Bad-faith chargebacks, abnormal dispute rates, and known fraud signatures all trigger account review. Fans who repeatedly dispute legitimate transactions can lose access to the platform.

Zero tolerance

Five categories of content result in immediate, permanent termination — and reporting to law enforcement and the National Center for Missing & Exploited Children (NCMEC) where applicable:

  1. CSAM, age-play, or any content depicting an apparent minor in a sexualized context.
  2. Non-consensual intimate imagery — recording, sharing, or threatening to share content of any individual without their documented consent.
  3. Sex trafficking, human trafficking, or content promoting either.
  4. Bestiality.
  5. Real, non-consensual violence or torture.

The same rules apply to AI-generated content. There is no carveout for synthetic CSAM or synthetic non-consensual content.

Reporting

If you see something that violates these rules, report it through the in-product report button or email abuse@fansit.com. Reports are confidential.

For copyright-specific issues, use the DMCA process.

Transparency

We publish a quarterly safety report on legal.fansit.com/safety covering takedowns, KYC denials, NCMEC reports filed, and similar metrics.