India's New Data Rules: A Missed Chance for Worker Protection
On November 13, 2025, the Indian government notified the Digital Personal Data Protection Rules, 2025, via G.S.R. 846(E). These rules, alongside the Digital Personal Data Protection Act, 2023, aim to establish a robust data governance regime. They set standards for consent notices, data fiduciary duties, security practices, grievance timelines, and the designation of Significant Data Fiduciaries. While this marks substantial progress for India's digital economy, the rules represent a profound missed opportunity for worker-specific protections.
The Expanding Landscape of Workplace Surveillance
Across white-collar, blue-collar, and platform sectors, Indian employers increasingly rely on a suite of technologies to monitor their workforce. These include biometric attendance systems such as fingerprint, iris, and facial recognition. Employers also utilize GPS-based location and route tracking, keystroke and screen-time loggers, and automated résumé filters. Hiring algorithms and AI-driven shift allocation, rating systems, and incentive optimization are also becoming commonplace. These systems directly shape workers’ hours, earnings, evaluations, and overall livelihood security.
Gaps in Worker Data Rights
A significant concern is the broad scope of "employment purposes." Section 7(i) of the DPDP Act permits employers to process worker data without consent for the purposes of employment or to safeguard the employer from loss or liability. The new Rules do not define or narrow this phrase, creating a risk of "function-creep," where data collected for routine operations is repurposed for monitoring or disciplinary control. The use of EPFO/UAN data for dual-employment checks is a clear example.
Furthermore, worker access and correction rights are heavily restricted. Under Sections 11-12 of the Act, a worker can access or correct their data only if they previously provided consent. However, most workplace data processing occurs under the non-consent clause of Section 7(i). This effectively denies workers access to their productivity metrics, the ability to inspect or correct algorithmic profiles, and the opportunity to review logs used in disciplinary actions.
No Safeguards for Automated Decisions
Neither the DPDP Act nor the accompanying Rules provide essential safeguards for automated decision-making at work. There is no right to an explanation for algorithmic outcomes, no requirement for meaningful information about the logic behind these decisions, no provision for human review, and no clear appeal process against algorithmic judgments. The only mention of algorithms appears in Rule 13(3), applicable only to Significant Data Fiduciaries, leaving most employers’ automated decisions in hiring, pay, scheduling, and performance management largely unregulated.
No Sensitive Data Tier
Notably, the DPDP framework lacks a specific category for "sensitive personal data." Biometric identifiers, which are intimate in nature and widely used in workplaces, receive no heightened protection. This contrasts sharply with global standards.
Missed Collective Redress Mechanisms
All data protection rights and grievance procedures under the framework are individualized. While Rule 14 sets timelines for acknowledgment and resolution, it includes no collective complaint mechanism. It grants no standing to unions or worker associations and does not require employers to consult workers before deploying monitoring technologies. This is a critical omission in an era where such systems often affect entire workforces.
Compared to global benchmarks, India's DPDP framework is strikingly silent on these workplace risks. The EU's GDPR treats biometrics as special category data and protects individuals against solely automated decisions; the EU AI Act designates employment-related AI systems as high-risk; and the California CPRA establishes a distinct category of sensitive personal information.
The Path Forward
Interventions are possible through delegated legislation, codes of practice, or guidance from the Data Protection Board. These could include defining and narrowing "employment purposes," extending access and correction rights to all worker data, mandating safeguards for significant automated decisions (including logic transparency and human review), creating a sensitive data tier for biometrics, and enabling collective complaints by unions.
Impact
These rules, while advancing data governance, leave India's workers exposed to unregulated digital and algorithmic harms. The current framework falls short of constitutional privacy jurisprudence emphasizing dignity, autonomy, and proportionality, particularly where opaque automated systems influence employment. Without further evolution through delegated legislation and regulatory guidance, Indian workplaces may fall short of the promise of fairness and oversight in the data age.
Difficult Terms Explained
- Digital Personal Data Protection (DPDP) Act: India's primary legislation designed to govern how personal digital data is processed.
- Data Fiduciary: An entity that alone or jointly with others determines the purpose and means of processing personal data.
- Significant Data Fiduciary (SDF): A data fiduciary or class of data fiduciaries designated by the government based on the volume and sensitivity of data processed and the potential risk to individuals' rights.
- Function-creep: The phenomenon where data collected for an initial, specific purpose is subsequently used for additional, unrelated, or more intrusive purposes without explicit consent.
- Algorithmic management: The use of algorithms and artificial intelligence to oversee and direct various aspects of work, including scheduling, performance assessment, and hiring decisions.
- Biometric data: Information relating to an individual's unique physical, physiological, or behavioral characteristics, such as fingerprints, facial features, or iris patterns, used for identification.
- GDPR (General Data Protection Regulation): A comprehensive data privacy law enacted by the European Union governing the protection of personal data and privacy.
- EU AI Act: A European Union regulation governing artificial intelligence systems, categorizing them based on their risk level.
- CPRA (California Privacy Rights Act): An amendment to California's existing consumer privacy law, introducing enhanced data privacy rights for state residents.
- Proportionality test: A legal principle that requires measures taken to be appropriate and no more than necessary to achieve a legitimate objective.