In early 2014, when protesters at mass gatherings in Kiev, Ukraine, received text messages saying, “Dear Subscriber, you are registered as a participant in a mass riot”, the anonymous mass of people suddenly turned into personally identifiable individuals.

This capacity for accurate tracing and pinpointing of location sent shock waves through the public, and understandably so, for in this modern age of surveillance a mobile phone offers an insight into the life of its owner. Now compare that scenario with what is happening in the United States, where it is reported that executives in Minnesota are using a Covid-19 tracing application (SafeDistance) to track down those protesting the killing of African-American George Floyd.

This raises questions of an unprecedented nature, the foremost of which pertains to privacy. When the State launched the Covid-19 tracing application to keep a check on suspected patients, how ethically justified would it be to use that information for tracking protesters? The second concern pertains to data privacy, specifically the idea of purpose limitation. Data-protection regimes across the globe, following the lineage of the GDPR, have accepted ‘Purpose Limitation’ as a core principle of data protection.

The basic premise of ‘Purpose Limitation’ is that every piece of information which the data controller (the Data Fiduciary in India) collects from the data subject (the Data Principal in India) shall be used only for the purpose for which it was collected. In other words, if the information was collected by the Covid-19 tracing application for tracing a suspected patient, it can be used only for that purpose; it surely cannot be used as an investigative tool by the executive. This brings us to the question of how the use of the mobile application could be restricted so that it does not become an investigative tool in the hands of the executive.
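To make the principle concrete, one could imagine purpose limitation enforced in software: every record is tagged at collection time with the purpose declared to the data subject, and an access request made for any other purpose yields nothing. The sketch below is purely illustrative; the class and field names are assumptions, not drawn from any real tracing application.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Record:
    subject_id: str
    location: str
    purpose: str  # the purpose declared to the data subject at collection

class PurposeLimitedStore:
    """Releases a record only for the purpose it was collected for."""

    def __init__(self):
        self._records = []

    def collect(self, record: Record):
        self._records.append(record)

    def access(self, requested_purpose: str):
        # Purpose limitation: a request made for any other purpose
        # (e.g. investigating protesters) releases no records at all.
        return [r for r in self._records if r.purpose == requested_purpose]

store = PurposeLimitedStore()
store.collect(Record("subscriber-1", "cell-tower-17", "contact-tracing"))

tracing = store.access("contact-tracing")        # the record is released
probing = store.access("protest-investigation")  # nothing is released
```

The design choice worth noting is that the purpose travels with the data itself, so the restriction holds no matter which agency later asks for access.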

This is a threat that many countries deploying such applications face today, because the interceptive mechanism of these applications lends itself to becoming a basis for investigation. The basic mechanism behind the Covid-19 tracing applications floated across the globe is Machine Learning (ML), a subset of Artificial Intelligence. Machine Learning operates on a cause-and-effect relationship: the system is fed categorised information and reacts to that information in a particular way, without requiring human intervention, per se, in its working. This digitised working renders the mechanism immune from human fallacies, which lends considerable credibility and trust to the Artificial Intelligence. However, the results produced by Machine Learning systems such as the tracing applications are used by human agencies to make decisions. The collection of data, the tracing of movement and the other metadata gathered by a Covid-tracing mobile application are the work of Artificial Intelligence; yet the actions that follow fall within the jurisdiction of a human agency.
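The composite described above, automated flagging followed by a human decision, might be sketched as follows. All names, data shapes and thresholds here are assumptions for illustration, not the workings of any actual application.

```python
def flag_contacts(proximity_log, infected_ids, threshold_minutes=15):
    """Automated step: flag anyone logged in proximity to an infected
    person for longer than the threshold. No human intervention."""
    return [entry["subject_id"]
            for entry in proximity_log
            if entry["other_id"] in infected_ids
            and entry["minutes"] >= threshold_minutes]

def act_on_flags(flags, declared_purpose, requested_action):
    """Human-agency step: a flag alone triggers nothing. An official's
    requested action proceeds only if it matches the purpose declared
    to the data subject at collection time."""
    if requested_action != declared_purpose:
        return []  # refuse: acting here would breach purpose limitation
    return flags

log = [{"subject_id": "A", "other_id": "P1", "minutes": 20},
       {"subject_id": "B", "other_id": "P2", "minutes": 3}]
flags = flag_contacts(log, {"P1", "P2"})          # only "A" crosses the threshold
notified = act_on_flags(flags, "contact-tracing", "contact-tracing")
blocked  = act_on_flags(flags, "contact-tracing", "investigation")
```

The point of the split is exactly the one the paragraph makes: the machine produces the flag, but the decision to act, and the duty to stay within the declared purpose, rests with the human agency.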

So, where do we locate ‘due process’ in this scenario? ‘Due Process’, as understood in its traditional construct, signifies the principles of natural justice, the first and foremost condition of which is procedural due process. Procedural due process entails that the means adopted for the application of a law should align with the principles of natural justice, and that adequate notice be given to the individual before proceeding against him or her. However, this traditional construct of ‘Due Process’ is completely dismantled when applied to an Artificial Intelligence-driven mechanism. This applies especially to an app like Aarogya Setu and other similar applications, where there is a composite of Artificial Intelligence and human agency.

Under this construct, adequate compliance with ‘Due Process’ will mean that the action taken by the executive is entirely in pursuance of the objective for which the information was collected by the application, which by default means ‘Purpose Limitation’. Since Covid tracing applications collect information for the sake of tracing suspected patients, and the individual supplying the information is notified that it will be used only for that purpose, the same information cannot be used for tracing protesters at a rally. That would go beyond the legitimate notice served and the consent given. Such misuse would procedurally malign the whole process, which by default is a breach of the ‘Due Process’ of law. In the case of Aarogya Setu and other apps like it, where there is a composite of Artificial Intelligence and human agency, it becomes essential for the human agency to ensure compliance with ‘Due Process’.

(The writer is Assistant Professor of Law, National Law University, Jabalpur)