Air travel and premature load-bearing

The Centre’s Digi Yatra scheme, which is slated to be rolled out from August this year at two of India’s largest airports by domestic passenger volume – Bengaluru and Varanasi – proposes to authenticate passengers using facial recognition technology (FRT), thus dispensing with the need for physical boarding passes or identity cards at airport terminals.

The biometric-based digital processing is touted to facilitate self-bag drop and automatic passenger check-in, all in a bid to shorten queues and reduce wait times. The use of FRT for air travel holds immense promise and could be a real game-changer for the Indian aviation sector. Yet while interest in the initiative is understandable, there is also sufficient cause for tempering that enthusiasm with caution, since several lingering questions remain unanswered.

The first concern with FRT deployment relates to the absence of any legal framework for FRT in India and the resultant lack of safeguards against discrimination. As of now, FRTs operate in a legal vacuum in India; the only governmental document mentioning FRT is a cabinet note from 2009, which is not considered law per se.


This points to a lack of adequate procedural safeguards for air travellers when issues arise from the use of FRTs. For example, AAI traffic data for Delhi airport (DIAL) recorded 63,08,530 domestic and international passengers in December 2019. Even a low inaccuracy rate of 2 per cent could exclude more than 4,000 DIAL passengers every day from accessing a crucial public service.
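To see how this figure follows from the cited traffic data, the back-of-the-envelope arithmetic can be sketched as below. This is an illustrative calculation only; it assumes the December 2019 DIAL total is spread evenly over 31 days and that the 2 per cent inaccuracy rate applies uniformly to every passenger, neither of which the source data guarantees.

```python
# Illustrative estimate of daily passengers potentially excluded by FRT
# errors at Delhi airport (DIAL), using the figures cited above.
# Assumptions: the December 2019 total is spread evenly over 31 days and
# the 2 per cent inaccuracy rate is applied uniformly to all passengers.

monthly_passengers = 6_308_530   # AAI traffic data for DIAL, December 2019 (63,08,530)
days_in_month = 31
error_rate = 0.02                # illustrative 2 per cent FRT inaccuracy rate

daily_passengers = monthly_passengers / days_in_month
excluded_daily = daily_passengers * error_rate

print(f"Average daily passengers: {daily_passengers:,.0f}")   # roughly 203,501
print(f"Potentially excluded per day: {excluded_daily:,.0f}") # roughly 4,070
```

The result, about 4,070 passengers a day, is consistent with the “more than 4,000” figure above.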

This DIAL hypothetical captures FRT implementation at just a single airport for a single day; at present, AAI manages over 120 airports, with Tier-I city airports accounting for an astounding 96 per cent of pre-pandemic air traffic volumes. Moreover, it is well documented that FRTs can be even more inaccurate in identifying people of particular skin colours, races and genders. Hence, hasty deployment of FRTs at the seven major domestic airports of Delhi, Bengaluru, Hyderabad, Kolkata, Varanasi, Vijayawada and Pune may exacerbate discrimination against these groups.

Secondly, the Digi Yatra policy as well as the Passenger Charter of Rights, both issued by the Ministry of Civil Aviation (MoCA), are silent on the safeguards and remedies available if a passenger holding a valid ticket is denied boarding because of a technical error on the part of the FRT. Interestingly, the Charter holds only airlines responsible for delays, denials and cancellations, not the airport authorities themselves. This is pertinent since FRTs are to be employed by the Government rather than by individual airlines.

The question to ask here is: will disgruntled passengers be able to hold the DGCA/MoCA liable for faults in the FRTs? Clarity from the concerned authority on the rights of passengers in such a scenario would be a good first step. Conceptually, a rushed nationwide deployment of an emerging technology without sound legal backing and adequate safeguards against discrimination is what scholars like Lant Pritchett and others have called ‘premature load-bearing’. The imagery invoked is that of the scaffolding of a bridge being mistaken for the bridge itself: if one were to drive a truck onto the scaffolding, it would collapse, defeating the purpose and undermining the progress made so far.

The deployment of FRT in India risks repeating this scenario. Thirdly, on a separate note, as Professor Karl Manheim and others have observed, AI/ML-based decision-making (such as that used in FRTs) is often likened to a proverbial ‘black box’ in which the actual reasoning process may be ‘unknown’ and even ‘unknowable’. Not understanding why certain people are denied entry by FRTs while others are not also goes against the well-established ‘transparency’ and ‘accountability’ prongs of the FATE framework (Fairness, Accountability, Transparency, and Ethics) for AI/ML-based technologies.

The EU and UK have devised an ingenious way to deal with such opaqueness: there is now talk of a ‘right to explanation’ of an AI’s reasoning whenever a case comes before a court of law. A similar right for India would be a welcome step. Fourthly, while ‘speed and efficiency’ have been MoCA’s watchwords of choice in publicly justifying the Digi Yatra ecosystem, a closer perusal of the 276th report of the Parliamentary Standing Committee on Transport, Tourism and Culture reveals that the ministry’s demands for grants for the scheme were premised on, and approved by the committee on, the logic of safety and security.

While deposing before the aforesaid committee, even the Director-General of the Bureau of Civil Aviation Security admitted that “So far as your concern regarding anti-national persons and terrorists passing through is concerned, there is provision in the (Digi Yatra) system. If you feed the photographs of suspected persons and that person passes through system, signal will be there and the security personnel are then alerted.” First mooted in August 2018, the Digi Yatra policy text has undergone several revisions.

The latest version of the policy (version 7.5) was released in March 2021. As highlighted by civil society organizations, guideline 9(a) of the ‘Personal Data Guidelines’ gives the Digi Yatra-Biometric Boarding System the ‘right to change the data purge settings based on security requirements on a need basis’. The policy does not disclose any threshold for such a ‘need’. Moreover, read with guideline 11, which states that the platform may provide ‘any security agency, or other government agency access to passenger data’, the section makes for vague and ambiguous guidance on whom the data may be shared with.

Furthermore, with the PDP Bill also gathering dust, the present legislative framework provides few safeguards for ordinary citizens against unauthorized access, use and/or disclosure. In conclusion, while public interest in the initiative is understandable, there is also sufficient cause for caution.

If left unaddressed, these issues may throw a spanner in the works during the very initial phase of an otherwise promising scheme’s operation. While it is undeniable that the future of air travel will involve ever-greater incorporation of technological solutions for a seamless travel experience, a more nuanced approach toward biometric privacy, through the introduction of a comprehensive statutory framework for the governance of FRTs, is the need of the hour before attempting a full-fledged, nationwide deployment.
