Reclaim the internet to empower, not exploit

The Supreme Court’s move on April 28 to examine obscenity regulations for OTT platforms and social media (in a PIL filed by Uday Mahurkar and others) is both urgent and necessary. However, the harm caused by each differs significantly: while OTT content typically involves willing actors and accountable creators, social media platforms enable the non-consensual sharing of images and videos of innocent individuals, often by anonymous or untraceable users who evade accountability.

This makes social media a far more potent tool for privacy violations and direct harm, frequently facilitating crimes against unsuspecting victims. In Varanasi, a 19-year-old woman was allegedly taken to a hotel by an accused who raped her and recorded a video. The video was then used as leverage: she was reportedly told to stay in the hotel or it would be circulated on social media, a threat that allegedly led to her being gang-raped by 22 others over the course of a week. This is just one example of a terrifying modus operandi: perpetrators not only assault victims but also film the act and use the threat of sharing these videos on social media to silence them and evade legal consequences.

This exploitation of technology (smartphones, the internet and social media platforms) constitutes a new and dangerous form of abuse: technology-facilitated gender-based violence (TFGBV). The non-consensual dissemination of objectionable images and videos of women and girls (image-based abuse) is becoming a crime more severe than sexual harassment itself. An iron fist must be brought down on this modus operandi to prevent such crimes.

Role of Social Media Platforms: Platforms like YouTube have developed the capability to detect and remove objectionable videos even as they are being uploaded, using advanced AI tools and human oversight.

Proactive detection of harmful content is technically feasible and already in practice. However, other platforms, particularly those like WhatsApp that use end-to-end encryption, prioritize user privacy and act only when a user files a complaint. As a result, if no complaint is made, harmful and abusive content can spread unchecked. All social media platforms have the technological capacity, like YouTube, to deploy tools that can detect and prevent the spread of objectionable content. Yet many have chosen not to implement such measures, for reasons that remain unclear. Society must now step up and build public pressure on these platforms to act responsibly.

Protecting privacy should not come at the cost of enabling the spread of gender-based violence or systemic harm to women.

Current Laws: Under Section 67A of the Information Technology Act, 2000, sharing obscene images or videos can lead to imprisonment for up to five years. The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, mandate that social media platforms take swift action to remove unlawful information; they must appoint a Grievance Officer to receive complaints from individuals and act on them within 24 hours. The National Cyber Crime Reporting Portal and its helpline (1930) handle such complaints, but their focus on financial fraud may limit attention to TFGBV cases.

The Digital Personal Data Protection Act, 2023, through Section 12(3), grants citizens the right to request the removal of objectionable information, images and videos from the internet (the “Right to be Forgotten”). The Data Protection Board will have the authority to impose fines of up to Rs. 50 crore for violations. This Act is yet to come into force.

Successful Best Practices: Actions taken by other countries to curb digital violence can serve as a guide. Mexico’s ‘Olimpia Law’, won through the struggle of Olimpia Melo after her intimate video went viral, has become a strong weapon against digital violence; it treats the sharing of private images and videos without consent as a serious crime. Germany’s Network Enforcement Act (NetzDG) requires social media platforms to remove harmful content within 24 hours, with failure to do so punishable by fines of up to 50 million euros.

Australia has established the world’s first statutory ‘eSafety Commissioner’, a system that works effectively for reporting objectionable content easily and getting it removed quickly. Britain’s ‘Revenge Porn Helpline’ (StopNCII.org) has successfully removed lakhs of objectionable photos. South Korea’s ‘Digital Sex Crime Victim Support Center’ is researching ways to proactively detect and remove harmful content even before complaints are filed.

Strategy: There is an urgent need to enact specific and stringent laws, like those in Mexico and Australia, that can severely punish not only those who initially post objectionable content but also those who forward or circulate it further. Social media platforms must be held accountable and face hefty fines for non-compliance with legal obligations, as in Germany’s NetzDG model. Further, it is worth considering a temporary ban on pornography to assess its potential to reduce incidents of image-based abuse and to protect children from exposure to harmful material.

India has strong legislation like the POCSO Act to prohibit Child Sexual Abuse Material (CSAM). However, little attention has been given to the growing problem of children accessing adult pornography and being exposed to it at a young age, a serious gap. The “Unprotected From Porn” report (Carroll et al., 2025), published by the Wheatley Institute and the Institute for Family Studies, reveals that over 97 per cent of boys and 78 per cent of girls aged 12–18 have viewed pornography. UNICEF cautions that exposure to pornography at a young age may lead to poor mental health, sexism and objectification, sexual violence, and other negative outcomes. Young people must be empowered with the knowledge that they are not helpless: they have the right to report harmful content and get it removed. Technology should serve as a tool for empowerment, not exploitation.

(The writer is a transparency advocate and author. Opinions are personal.)
