The Delhi High Court has asked the Centre and Internet majors such as Google and Facebook to submit their stand on removing offending images of children and preventing them from resurfacing. The court's order came while hearing a plea by a woman who sought the removal of objectionable photographs of her, allegedly taken when she was a minor. The plea stated that at the time she was studying in a well-known school.
HC Order on Child Pornography
Since the plea was filed in July this year, Google, which owns YouTube, and Facebook, which owns Instagram, have removed around 49 URLs, shared by the enforcement agencies, hosting the offending content. However, the court wanted answers on the vexed issue of how such content can be blocked permanently from resurfacing once it has been identified as offending. Justice Vibhu Bakhru observed that while enforcement agencies will report the image to Facebook and Google as and when it resurfaces, the problem is that somebody has to keep monitoring it.
The court's query came after the status report filed by the Delhi government stated that further URLs containing the offending images and clips had been uploaded on YouTube, Telegram and Instagram. Both Facebook and Google submitted that they have protocols for preventing child pornography and will file a comprehensive affidavit disclosing them by the next hearing.
There has been an increase in the trade of illicit content, including through the dark net. The global response to internet child pornography and to safeguarding children from sexual abuse therefore requires a collaborative strategy and the standardisation of domestic legislation across the world. The International Centre for Missing and Exploited Children (ICMEC), in its 2018 report ‘Model Legislation & Global Review’, assessed the national legislation of 196 countries against a set of criteria grouped into six parts: definition of a child and of child sexual abuse material (CSAM); offences; mandatory reporting; industry responsibility; sanctions and sentencing; and law enforcement and data retention.
Indian law is at par with the model law as far as the definition of ‘child’ is concerned. While a person under the age of 18 may be able to freely consent to sexual relations, such an individual is not legally able to consent to any form of sexual exploitation, including CSAM. Therefore, defining anyone under the age of 18 as a child across the globe is a welcome move. The model law also requires the material to be defined as “CSAM” rather than “child pornography”, to more accurately describe the criminal nature of such material and to avoid any confusion regarding consent. The definition should also include technology-specific terminology, which Indian law does.
The model law requires that ‘knowing possession’ and ‘knowingly downloading or viewing’ be offences. The IT Act (Section 67-B) says that whoever ‘collects, seeks, browses, downloads’ child pornography is an offender. Whether the act is done accidentally or knowingly is left to the court’s interpretation, even though there is a vital difference between inadvertently viewing an image and actively downloading it. The POCSO Act punishes only those who store child pornographic material for commercial purposes. This caveat of ‘commercial purposes’ must go, and mere possession of CSAM should be made a criminal offence. Similarly, offering information on where to find CSAM, such as by providing a website address, should also be criminalised; no such provision exists at present.
Another parameter of the model law is mandatory reporting of CSAM by ISPs. ISPs are the channels through which the proliferation of CSAM takes place. It is, therefore, crucial that ISPs report illicit content discovered on their networks to law enforcement agencies, or to another mandated agency, as soon as they become aware of it. In India, however, intermediaries are not responsible under the current law for communicating third-party information to any agency. In the Shreya Singhal case (2015), the Supreme Court held that either a court order or a notification by the appropriate government or its agency is necessary for an ISP to remove or disable access to illicit material. Thus, ISPs are not suo motu responsible for notifying law enforcement agencies of any CSAM carried through their channels.