Are Indians at risk of unchecked data profiling?
An omission from the DPDP Act raises concerns about data misuse with far-reaching implications.
Ritvij Ratn Tiwari

Credit: DH Illustration

In The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, Shoshana Zuboff warns of a world where data becomes the raw material for manipulation, turning individuals into predictable products of their digital selves. This reality is no longer confined to dystopian speculation; it has arrived, most recently highlighted by Clearview AI’s controversial practices.

In September 2024, the US-based facial recognition company was fined €30.5 million ($33.7 million) by the Dutch Data Protection Authority (DPA). Clearview AI was penalised for creating an illegal database of billions of images scraped from the internet without consent, a clear violation of the European Union’s General Data Protection Regulation (GDPR). This enforcement action raises critical questions for India as it navigates its new data protection law, especially regarding the handling of personal data processed outside its borders.


The Digital Personal Data Protection (DPDP) Act in India claims to have extraterritorial reach. Section 3(b) states that the law applies to the processing of digital personal data outside India if such processing is in connection with the offering of goods or services to individuals within the country. On paper, this ensures that foreign businesses targeting Indian citizens remain within the law’s ambit.

However, what stands out is the removal of a critical concept: profiling. Earlier drafts of the bill extended coverage to the profiling of Indian citizens regardless of whether goods or services were offered; the final version passed by Parliament omits this provision. With the rules implementing the Act on the verge of being notified, addressing this omission has acquired renewed urgency.

Profiling is the process of analysing personal data to predict behaviours, preferences, or interests. It is not just a privacy issue but an existential risk, as articulated in works like Nick Bostrom’s Superintelligence. Unchecked data analysis can enable powerful actors to simulate and predict societal behaviours, paving the way for manipulation that transcends the economic sphere into political and ethical domains.

Bostrom warns of the unforeseen consequences when technologies develop beyond regulatory and ethical oversight. In India’s case, the omission of profiling allows foreign entities to conduct data analysis for non-commercial purposes, free from Indian oversight, a loophole that could have significant anthropic implications. We must ask whether the Act fails to foresee these risks, and whether we are failing Bostrom’s reversal test: would we accept such data vulnerabilities if they were being newly introduced, rather than merely left unaddressed by omission?

Why does the Clearview AI case matter? The company built its business by scraping publicly posted images from social media and other platforms into a massive facial recognition database. Under the GDPR, regulators had the power to step in, penalising Clearview AI for violating people’s data privacy rights. What makes the GDPR particularly significant is its extraterritorial reach: any company processing the data of European citizens, no matter where it operates, must comply. Moreover, the GDPR takes a firm stand against profiling, making it unlawful without informed consent or another valid legal basis.

In contrast, India’s DPDP Act, with its omission of profiling, would allow similar practices to go unchecked. If Clearview AI were to profile Indian citizens without offering any direct goods or services, it would technically escape Indian law’s reach. This creates a dangerous scenario where foreign actors, whether non-profits, think tanks, or political agencies, can analyse Indian citizens’ data to predict behaviours, influence opinions, or target individuals without any legal accountability. Such scenarios evoke concerns raised in the literature on whole brain emulation and human enhancement ethics, where data-fuelled simulations are hypothesised to model behaviours in unprecedented ways, offering power without accountability.

Risks and regulation

The omission of profiling also reflects a larger flaw in India’s data protection regime: its narrow focus on commercial activity erroneously assumes that all non-commercial data processing is inherently benign. History has shown that data can be weaponised for misinformation campaigns, political manipulation, and surveillance. The Cambridge Analytica scandal in the United States and the United Kingdom demonstrated the immense power of data profiling to influence elections.

In India, where data privacy awareness is still growing, such vulnerabilities are even more pronounced. The omission also raises questions about Parliament’s intent: was it a deliberate choice to appease certain stakeholders, or a legislative oversight? Earlier drafts of the bill clearly recognised the risks posed by profiling, so its absence cannot be dismissed as a minor technical change. It leaves the door open for foreign entities to exploit Indian data without fear of legal repercussions, undermining the very purpose of a data protection regime.

If we consider data a resource analogous to energy or labour, India’s failure to regulate profiling mirrors a broader ethical concern raised in discussions of human enhancement and transhumanism: the unchecked application of powerful tools without evaluating their long-term societal consequences. The anthropic principle reminds us that our existence within a technological paradigm demands responsibility in addressing known risks. The omission is not neutral; it is an act that invites exploitation.

To address these shortcomings, India must look to global best practices. The European Union’s GDPR and China’s Personal Information Protection Law offer valuable lessons: both frameworks regulate profiling explicitly, ensuring that individuals retain control over how their data is used. India must reintroduce provisions regulating profiling, closing the loophole created by their omission. Additionally, the law’s extraterritorial scope must be broadened to cover all processing of Indian citizens’ data, regardless of its purpose or commercial intent.

The Clearview AI case serves as a cautionary tale. It highlights the risks posed by unchecked data processing and the power of robust regulations in holding violators accountable. India cannot afford to be complacent in this digital age, where data transcends borders and becomes a tool for influence and manipulation. The DPDP Act, while a step forward, must be strengthened to address these emerging challenges. By reintroducing provisions on profiling and expanding its extraterritorial reach, India can create a comprehensive framework that truly protects its citizens’ data.

(The writer is a final-year student at the National Law School of India University, Bengaluru)

(Published 19 December 2024, 04:13 IST)