According to an Economic Times report, the government may ask social media companies to provide detailed information about the steps they have taken to prevent pornographic and child sexual abuse content on their platforms. The government had issued a notice to these platforms on October 6; the report states that in the notice, MeitY asked the companies to permanently block content related to these sensitive issues.
The government wants social media companies to implement technical measures on their platforms, including an automated tool that can identify such content and block it permanently.
The report says the government has warned that, in case of non-compliance, these companies risk losing the safe harbour protection granted to them under the Information Technology Rules, 2021. The rules require all social media intermediaries to deploy “advanced measures, including automated tools” not only to block “obscene, pedophilic” content but also to actively identify any information that “in any form depicts any act of rape, child sexual abuse, etc.”
According to the report, YouTube and Telegram responded to the notice, saying they have a “zero tolerance” policy for pornographic and child sexual abuse content on their platforms and have invested heavily in technology and teams to fight child sexual abuse online.
The platforms say that, in the second quarter of 2023, they removed more than 94,000 channels and over 2.5 million videos for violating their child safety policies.