Policy to Prevent Spread of Child Sexual Abuse Material (CSAM)

Asianet News Digital Media is deeply committed to fighting the spread of child sexual abuse material (CSAM), whether photographic, textual, illustrated, or computer-generated. We view it as our responsibility to ensure our platform is not used for sharing or consuming CSAM, and to deter users from searching for it.

We have a zero-tolerance policy towards any material that sexualizes, sexually exploits, or endangers children on our platform. If we find such material or are made aware of it, we will report it.

Any content featuring or depicting a child (real, fictional, or animated), or promoting child sexual exploitation, is strictly forbidden on our platform and is a severe violation of our Terms of Service. Written content (including, but not limited to, captions, content titles, and content descriptions) that promotes, references, or alludes to the sexual exploitation or abuse of a child is also strictly prohibited.

For the purposes of this policy, a child is any person under eighteen (18) years of age. We report all cases of apparent CSAM to the National Human Rights Commission (NHRC) and the National Center for Missing and Exploited Children (NCMEC), a nonprofit organization that operates a centralized clearinghouse for reporting incidents of online sexual exploitation of children. NCMEC makes these reports available to the appropriate law enforcement agencies globally.

If you encounter child sexual abuse material on Asianet News Digital Media, please report it to us using the flagging feature, which appears on every piece of content. All complaints and reports made to Asianet News Digital Media are kept confidential and are reviewed by human moderators, who work swiftly to handle the content appropriately.

Guidelines
DO NOT post material (whether visual, audio, or written content) that:
  • Features, involves, or depicts a child.
  • Sexualizes a child. This includes content that features, involves, or depicts a child (including any illustrated, computer-generated, or other realistic depiction of a human child) engaged in sexually explicit conduct or sexually suggestive acts.

Enforcement

We have strict policies, operational mechanisms, and technologies in place to take swift action against CSAM. When we identify or are alerted to an actual or potential instance of CSAM appearing on the platform, we remove and investigate the content and report any material identified as CSAM. We also cooperate with law enforcement investigations and promptly respond to valid legal requests to assist in combating the dissemination of CSAM on our platform.

In conjunction with our team of human moderators and regular audits of our platform, we also rely on innovative industry-standard technical tools to assist in identifying, reporting, and removing CSAM and other types of illegal content from our platform. We use automated detection technologies as added layers of protection to keep CSAM off our platform.

These technologies include:

  • YouTube’s CSAI Match, a tool that assists in identifying known child sexual abuse videos.
  • Microsoft’s PhotoDNA, a tool that aids in detecting and removing known images of child sexual abuse.
  • Safer, Thorn’s comprehensive CSAM detection tool, used to keep platforms free of abusive material.
  • NCMEC Hash Sharing, NCMEC’s database of known CSAM hashes, including hashes submitted through NCMEC’s Take It Down service by individuals fingerprinting intimate imagery taken of themselves as minors.

Together, these tools play a fundamental role in our shared fight against the dissemination of CSAM on our platform, as well as our mission to assist in collective industry efforts to eradicate the horrendous global crime that is online child sexual exploitation and abuse.

How You Can Help Us

If you believe you have come across CSAM, or any other content that otherwise violates our Terms of Service, we strongly encourage you to immediately alert us by flagging the content for our review.

Anyone can report violations of this policy using the flagging feature, whether they have an account on our platform or not.

Consequences for Violating This Policy

We have a zero-tolerance policy towards any content that involves a child or constitutes child sexual abuse material. All child sexual abuse material that we identify or are made aware of results in the immediate removal of the content in question and the banning of its uploader. We report all cases of apparent CSAM to the National Human Rights Commission (NHRC) and the National Center for Missing and Exploited Children (NCMEC).

Additional Resources and Support

If you believe a child is in imminent danger, you should reach out to your local law enforcement agency to report the situation immediately.

You may also choose to report cases of child sexual exploitation or abuse material to any of the following organizations dedicated to eliminating and preventing child sexual exploitation. Reports can be made anonymously and play an integral part in protecting the safety of children.

National Human Rights Commission (NHRC)
National Center for Missing and Exploited Children (NCMEC)
Internet Watch Foundation