NEW DELHI: Microsoft Corporation and Google LLC have argued before the Delhi High Court that it is not possible for search engines to automatically identify and take down intimate images that have been uploaded to the internet without the victim's consent.
The submission was made in their challenge to an order directing them to take down non-consensual intimate images (NCII) within 24 hours, failing which they would risk losing safe harbour protection under Section 79 of the Information Technology Act, 2000 (IT Act).
In an earlier plea by the companies on the matter, the Delhi High Court had directed Google and Microsoft to file a review petition against the order if they could not comply with it.
The rationale behind the order was that access should be promptly disabled by the search engines themselves without requiring the victim to provide specific URLs for each violation separately.
However, Senior Advocate Jayant Mehta submitted to the Court,
"Technology is evolving but we have not reached that stage yet. To say that you [search engines] are required to do it today otherwise your immunity is gone, that can't be. It is work in progress. I am endeavouring to reach it. But to say that I must do it today is not fair."
The Court's order, which directed search engines to issue a unique token upon the initial takedown of NCII, was aimed at sparing victims the burden of reproducing links to the circulating violative content.
The rationale behind this was to provide a solution to the issue of victims having to keep track of, and repeatedly approach authorities with, specific URLs every time this content resurfaced.
The Court also pointed out that the search engines' failure to take down the violating content within 24 hours defeats the object of the IT Act and the rules framed under it.
It therefore directed the assignment of unique tokens, so that if the same content resurfaced, the search engine could use pre-existing technology to take it down.
The Court also recommended the use of hash-matching technology and the development of a trusted third-party encrypted platform for the registration of NCII content.
This too was aimed at reducing the victim's burden. However, Google argued that automated tools cannot determine consent or the absence of it, which is the very foundation of non-consensual intimate imagery.
Both Microsoft and Google said that this could therefore lead to takedowns of consensually shared sexual content, resulting in privatized censorship and an impingement on the freedom of speech.
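For context, the hash-matching approach the Court referred to generally works by storing a fingerprint of content that has already been taken down and comparing newly surfaced material against that registry. The following is a minimal illustrative sketch in Python, assuming a simple exact (cryptographic) hash; the function names and the in-memory registry are hypothetical, and production systems such as perceptual-hashing tools are far more sophisticated, matching content even after resizing or re-encoding.

```python
import hashlib

# Hypothetical registry of hashes ("tokens") for content already taken down.
known_ncii_hashes = set()

def register_takedown(image_bytes: bytes) -> str:
    """Record a hash of removed content and return it as a unique token."""
    token = hashlib.sha256(image_bytes).hexdigest()
    known_ncii_hashes.add(token)
    return token

def matches_known_content(image_bytes: bytes) -> bool:
    """Check whether newly surfaced content matches previously removed content."""
    return hashlib.sha256(image_bytes).hexdigest() in known_ncii_hashes
```

Note that an exact hash like this only catches byte-identical copies, and, as the companies argued before the Court, no hash of any kind can tell whether the depicted person consented; the technique can only recognise content that a human has already identified and registered.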