On January 3, 2025, the long-awaited Digital Personal Data Protection (DPDP) Rules were finally published, 16 months after the DPDP Act was enacted in August 2023.
However, the new rules fail to address many of the ambiguities and concerns surrounding the Act. Additionally, some provisions in the rules appear to undermine rather than protect individual privacy, raising further doubts about their effectiveness.
This article attempts to cover some of the key concerns regarding the impact of the rules on an individual’s privacy and rights. There are additional areas of concern that require further debate and deliberation.
One of the most glaring issues with the DPDP Rules is their failure to clarify critical aspects of the DPDP Act, despite a gap of 16 months.
For instance, the concept of “informed consent” (Rule 2) remains a contentious issue, particularly in a country like India where digital literacy is limited.
To substantiate, the Centre for Economic and Social Studies (CESS) found that, on average, only 12% of individuals over 15 years of age in India are computer literate.
While the Act and rules seek to ensure users’ consent to data processing, the question remains: can individuals truly understand and make informed decisions about their data when many lack access to and basic knowledge of how digital platforms operate?
Moreover, withdrawing consent—often touted as a user’s right—is far from straightforward. Does it mean that all of a user’s data is erased? And how can users verify that their data has indeed been deleted?
Even more troubling is the ambiguity around the “right to be forgotten”. The idea that individuals should be able to have their personal data erased upon request sounds promising in theory.
However, the rules do not clarify how this right can be effectively exercised, especially given the lack of clear procedures and enforcement mechanisms. Data retention, it seems, continues to take precedence, leaving users’ personal information vulnerable for longer periods than they may be comfortable with.
Further complicating the issue are provisions related to data breaches and security safeguards. The rules mention that data fiduciaries must implement “reasonable security safeguards”, but these safeguards remain vaguely defined and peppered with the word “adequate”.
What exactly constitutes “adequate” or “reasonable” security? If an intermediary suffers a breach despite “adequate” measures, will it be penalised for non-compliance? The rules do not provide clear answers, creating uncertainty for both companies and individuals about the repercussions of a breach.
Rule 10, which sets out how children's data may be processed, also raises significant concerns.
According to the rules, the data fiduciary is required to verify the consent of a child’s guardian if the child discloses their age. However, this creates a potential minefield of privacy risks when one thinks about how this may be enforced.
For one, the child would have to proactively notify the data fiduciary of their age, and the guardian would need to share identity verification details. What happens if the child fails to inform the fiduciary that they are a minor? In effect, the rule places a heavy burden on all users to prove their identity, potentially compromising personal data privacy by mandating identity verification for everyone.
Among the most heavily challenged provisions of the DPDP Act are the blanket exemptions for the central government. The rules continue to provide the government with sweeping access to personal data under the pretext of protecting "the sovereignty and integrity of India or security of the State".
This provision, which grants the government seemingly unfettered access to citizens' data on vaguely worded grounds, remains unchanged from the original Act and continues to lack any meaningful limitations or safeguards. It raises questions about the balance between national security and individual privacy rights, a balance that remains dangerously tilted in favour of government surveillance.
It is also important to highlight two key exemptions in the rules and the Act. First, Rule 15 exempts data used for “research, archiving, or statistical purposes,” but does not provide further definitions or limitations for these categories.
Second, Clause 3(c)(ii) of the Act exempts all publicly available data, which opens the door for scraping such data to develop AI tools. This includes, for instance, the use of publicly available social media images for training facial recognition technologies. These broad exemptions raise concerns about the potential misuse of personal data without adequate safeguards.
One then turns to governance mechanisms for ensuring privacy. While Rule 7 outlines the process for intimating affected parties after a data breach, there are no provisions for an independent body to oversee regular audits or handle grievance redressal. Rule 13(3) states that data fiduciaries and consent managers will operate their own grievance redressal systems and, to ensure "the effectiveness of the system in responding within such period", must "implement appropriate technical and organisational measures". How does one quantify what is "appropriate"? Such details are left to the discretion of the very entities this Act is meant to govern.
Moreover, while the DPDP Rules are open for public consultation, they are currently only available in English and Hindi, which severely limits the ability of most citizens to engage with the consultation process. In a country as diverse as India, with many regional languages, this narrow accessibility undermines democratic participation.
The DPDP Rules, as they stand, represent a missed opportunity to strengthen (or rather, introduce) data privacy protections in India. While they may have been drafted with the intention of safeguarding citizens’ personal data, they fall short in multiple key areas, leaving individuals exposed to potential misuse of their information.
The government must address these concerns and take immediate steps to make the rules clearer and ultimately more protective of individual privacy.
(The author is an assistant programme manager at the Takshashila Institution)