Last Updated: 2026-03-16 ~ DPDP Consultants
India's Digital Personal Data Protection Act has arrived —
and for EdTech platforms sitting on years' worth of student learning data, it's
a genuine wake-up call. Here's what you actually need to know.

Let's be honest about something: the EdTech industry built its growth on data.
Every quiz attempt, every video pause, every re-read, every purchase click —
it's been tracked, stored, and in many cases monetised or fed into algorithms
that shape what students see next. For a long time, this happened in a
regulatory vacuum.
That vacuum no longer exists.
India's Digital Personal Data Protection (DPDP) Act,
2023 — the country's first comprehensive data protection law — has
moved from Parliament to the implementation phase. With Rules under active
notification and enforcement timelines taking shape, EdTech platforms are now
sitting on an uncomfortable question: what do we do with all the data
we've been collecting?
This blog breaks it down plainly. Not in legalese, but in
the practical terms that matter for founders, product managers, compliance
teams, and educators building platforms that touch millions of student lives.
1. What Exactly Is the DPDP Act?
The Digital Personal Data Protection Act, 2023 is India's
landmark legislation governing how organisations collect, use, store, and
process the personal data of Indian residents. It applies to digital personal
data — data collected online, or data collected offline but later digitised.
Think of it as India's answer to the EU's GDPR, though with its own distinct architecture. It centres on a few core ideas:

- Consent as the default lawful basis for processing, given freely and for a specific purpose
- Clear, plain-language notice to users about what is collected and why
- Data minimisation and purpose limitation: collect only what the declared purpose requires
- Enforceable rights for individuals over their own data
- Accountability of organisations, backed by substantial penalties

For EdTech, the Act creates a category of heightened concern: "children's data" — defined as data of individuals under 18 years of age. This is not just an afterthought. It's one of the most stringent parts of the entire law, and it sits squarely at the heart of what EdTech does.
Key term: Under the DPDP Act, organisations processing data are called Data Fiduciaries. Individuals whose data is processed are Data Principals. If you run an EdTech platform, you are a Data Fiduciary the moment you collect so much as an email address from a user.
2. What Data Does EdTech Actually Collect?
Before we get into obligations, it's worth stepping back and
honestly cataloguing what data EdTech platforms typically hold. The list is
longer than most people realise.
| Data Category | Examples | Sensitivity Level | Common Use |
| --- | --- | --- | --- |
| Identity data | Name, email, phone, date of birth | Medium | Account creation, login, communication |
| Demographic data | Age, gender, city, school/college name | Medium | Personalisation, market segmentation |
| Learning behaviour data | Time on page, quiz scores, video completion %, re-attempts | Lower | Adaptive learning, progress tracking |
| Financial data | Payment history, subscription tier, EMI records | High | Billing, upselling |
| Device & technical data | IP address, device ID, browser, OS, location (GPS) | Medium | Anti-fraud, analytics |
| Biometric/sensitive data | Face ID for proctoring, voice recordings | High | Exam integrity |
| Parent/guardian data | Parent name, contact, consent records | Medium | Parental controls, billing for minors |
| Social data | Posts in forums, peer-group interactions | Lower | Community features |
What makes EdTech uniquely complex is the combination
of who the data is about (minors, in many cases) and how much of
it is generated. A student who spends two hours on an online learning platform
generates hundreds of data points — far more granular than, say, a Netflix user
watching a film. This level of behavioural data is precisely what makes EdTech
products effective — and what makes DPDP compliance so consequential.
3. Key Obligations for EdTech Platforms Under DPDP
The DPDP Act places a detailed set of obligations on Data
Fiduciaries. For EdTech companies, these translate into some concrete,
practical requirements that touch nearly every part of the product and
business.
Lawful basis for processing
Under the Act, personal data can only be processed under
a valid lawful basis. For most EdTech use cases, this means either
explicit consent from the user, or one of the "legitimate uses"
specified by the government (such as processing for employment purposes or
certain public interest functions). The catch: "we always collected
it" is not a lawful basis. Neither is a pre-ticked checkbox buried in 30
pages of terms.
Notice requirements
Every time you collect personal data, you must provide a
clear, plain-language notice that tells the user:

- what personal data is being collected, and for what purpose
- how they can exercise their rights, including withdrawing consent
- how to raise a grievance with you, and how to complain to the Data Protection Board

Data Minimisation in practice
This is the one that will force the most product-level
changes. EdTech platforms have historically collected data "just in
case" — building large data lakes on the assumption that more data means
better product. The DPDP Act directly challenges this. You may only collect
data that is adequate, relevant, and necessary for the
declared purpose.
Storage limitation and data deletion
Once the purpose for which data was collected is fulfilled,
the data must be erased. This is a structural challenge for platforms that have
been storing user data for years. Old inactive accounts, historical quiz
performance data, payment records from discontinued subscriptions — all of this
needs a defined lifecycle with clear deletion timelines.
Heads up for product teams: Storage limitation
is not just a backend concern. It requires a product decision: what is the
declared purpose of each data point you collect, and when is that purpose
"served"? Many EdTech platforms have never formally answered this
question. Now they must.
4. The Consent Problem — Especially for Minors
If there's one aspect of the DPDP Act that should keep
EdTech leaders up at night, it's the provisions around children's data. The Act
defines a child as any individual under 18 years of age, and the rules for
processing their data are significantly more stringent than for adults.
| Scenario | Adult User (18+) | Minor User (under 18) |
| --- | --- | --- |
| Who gives consent? | The user themselves | Parent or legal guardian |
| Standard of consent | Informed, free, unambiguous | Verifiable parental consent required |
| Behavioural tracking | With consent | Restricted — must not cause detrimental effect on child |
| Targeted advertising | With consent | Prohibited |
| Profiling | With consent + purpose | Prohibited |
| Age verification | Self-declaration (plus platform duty of care) | Robust age verification + parental consent verification |
Let that sink in for a moment. If a significant portion of
your user base is under 18 — and for platforms serving K-12, test prep, or
competitive exam students, this is almost certainly the case — you are
prohibited from using their data for behavioural targeting or profiling. That's
not just a product feature. For many EdTech business models, it touches
monetisation strategy directly.
What counts as "verifiable parental consent"?
The Act mandates that consent from a parent or guardian must
be "verifiable." The Rules are expected to spell out what
verification mechanisms are acceptable. Currently, likely approaches include
Aadhaar-based eKYC for parents, OTP verification linked to a guardian's
identity, and digitally signed consent forms. Whatever mechanism is used, it
must be robust — a simple checkbox labelled "I confirm I am the
parent" will almost certainly not meet the standard.

The "age-gating" problem
Many EdTech platforms today do not rigorously verify user
age. A 15-year-old can sign up with a self-declared birthdate of 2000 and
access a platform as an adult. Under the DPDP Act, this is no longer acceptable
— platforms have a duty of care to implement reasonable technical and
procedural measures to ensure age is accurately verified. Failing to
do so doesn't transfer liability to the user; it remains with the platform.
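To make the age-gating duty concrete, here is a minimal sketch of an onboarding gate that computes age from a declared date of birth and routes minors into a parental-consent flow. The flow names are hypothetical, and self-declaration alone does not satisfy the duty of care; the routed flow is where real age and guardian verification would happen.

```python
from datetime import date

ADULT_AGE = 18  # the DPDP Act treats anyone under 18 as a child

def age_on(dob: date, today: date) -> int:
    """Completed years of age on a given date."""
    years = today.year - dob.year
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1  # birthday hasn't occurred yet this year
    return years

def onboarding_route(dob: date, today: date) -> str:
    """Decide which sign-up flow to start for a self-declared date of birth.

    Note: this only chooses the flow; robust verification (e.g. of the
    guardian's identity) must happen inside the chosen flow.
    """
    if age_on(dob, today) < ADULT_AGE:
        return "parental_consent_flow"  # hypothetical flow name
    return "adult_consent_flow"         # hypothetical flow name
```

The point of splitting `age_on` out is that the same check can gate every age-sensitive feature (profiling, advertising), not just sign-up.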
5. Student (and Parent) Data Rights Under DPDP
The Act grants data principals a set of enforceable rights
that EdTech platforms must actively support through their products and
processes. These are not aspirational goals — they're legal entitlements.

- Right to access information about what personal data is processed, and how
- Right to correction and erasure of personal data
- Right to grievance redressal
- Right to nominate another individual to exercise these rights in the event of death or incapacity

For EdTech platforms, supporting these rights means building
operational processes around them — not just adding a "Delete
Account" button. A proper data subject rights (DSR) framework includes a
request intake process, identity verification before data is disclosed, defined
response timelines (the Act specifies these), and documentation of every rights
request and outcome.
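A DSR framework of this kind can be sketched as a simple request record with deadline tracking. This is an illustrative data structure, not a prescribed format, and the 30-day SLA is a placeholder: use the response timelines the notified Rules actually prescribe.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional

@dataclass
class DSRRequest:
    user_id: str
    request_type: str                      # "access" | "correction" | "erasure" | "grievance"
    received_at: datetime
    identity_verified: bool = False        # verify identity before disclosing any data
    resolved_at: Optional[datetime] = None
    sla_days: int = 30                     # placeholder — substitute the timeline from the Rules

    @property
    def due_by(self) -> datetime:
        return self.received_at + timedelta(days=self.sla_days)

    def is_overdue(self, now: datetime) -> bool:
        return self.resolved_at is None and now > self.due_by

def overdue_requests(requests: List[DSRRequest], now: datetime) -> List[DSRRequest]:
    """The compliance review queue: unresolved requests past their deadline."""
    return [r for r in requests if r.is_overdue(now)]
```

Keeping every request as a record, resolved or not, also gives you the documentation trail of each rights request and outcome.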
Importantly, the right to erasure is not absolute. EdTech
platforms may retain data beyond a user's erasure request if there is a legal
obligation to retain it (for example, financial records required under
tax law) or if the data is necessary to comply with a court order. However, the
platform must clearly document and justify any such exception.
6. Penalties — What's Actually at Stake
One thing is clear from the DPDP Act's penalty structure:
this law has teeth. The Data Protection Board of India (DPBI) — the regulatory
authority established under the Act — has the power to impose significant
financial penalties for violations.
| Violation | Maximum Penalty |
| --- | --- |
| Failure to implement reasonable security safeguards leading to a data breach | ₹250 crore |
| Failure to notify the Board and affected data principals of a breach | ₹200 crore |
| Non-compliance with obligations relating to children's data | ₹200 crore |
| Non-compliance with Significant Data Fiduciary obligations | ₹150 crore |
| Other violations of the Act or Rules | ₹50 crore |
It's worth noting that these penalties are per
violation, not subject to a single annual cap in the way some other
jurisdictions work. For a platform that has been systematically processing
children's data without adequate consent — across millions of users — the
aggregate exposure can be substantial.
Important context: The Data Protection Board
functions as an adjudicatory body, not just a regulator. Complaints can be
filed by any data principal, and the Board has the power to summon evidence,
examine witnesses, and impose penalties. This is meaningfully different from
regimes where enforcement is entirely regulator-initiated.
Beyond the financial penalties, there are reputational
stakes. In a sector where trust is literally the product — parents are
entrusting platforms with their children's education and data — a high-profile
enforcement action or data breach is not just a legal problem. It's an
existential one.
7. A Practical Compliance Roadmap for EdTech Platforms
Compliance with the DPDP Act is not a one-time project. It's
an ongoing operating discipline. But there's a sensible sequence to getting
started, especially for platforms that are beginning from a position of limited
prior data governance.
Phase 1 — Data Discovery and Mapping (Weeks 1–4)
Audit every data point your platform collects. Map it to the
system where it's stored, the purpose it serves, and the legal basis for
processing it. Include third-party tools (analytics SDKs, ad platforms, CRMs) —
they are your responsibility too.
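One lightweight way to structure this audit is a data inventory: one entry per field, mapping it to its storage system, declared purpose, and legal basis. The entries below are illustrative examples, not a prescribed schema.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class InventoryEntry:
    category: str                  # e.g. "learning_behaviour"
    field_name: str                # e.g. "quiz_score"
    storage_system: str            # where it lives, e.g. "analytics_warehouse"
    purpose: str                   # the declared purpose from your privacy notice
    legal_basis: str               # "consent" or a notified legitimate use
    shared_with_third_party: bool  # leaves the platform via an SDK/vendor?

inventory = [
    InventoryEntry("identity", "email", "user_db", "account login", "consent", False),
    InventoryEntry("device", "ip_address", "analytics_warehouse", "anti-fraud", "consent", True),
    InventoryEntry("financial", "payment_history", "billing_db", "billing", "consent", False),
]

# Audit queries fall out of the inventory for free — e.g. data leaving the platform:
external = [e.field_name for e in inventory if e.shared_with_third_party]

# ...or fields with no declared purpose (a red flag under data minimisation):
undeclared = [e.field_name for e in inventory if not e.purpose]
```

The same inventory then seeds Phases 2 and 3: every entry needs a consent purpose and a retention period.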
Phase 2 — Consent Architecture Redesign (Weeks 4–8)
Review and rewrite your privacy notice in plain language.
Build or redesign consent flows — particularly for minor users. Implement
verifiable parental consent mechanisms. Document every consent instance with
timestamps and metadata.
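The consent-documentation step can be as simple as an append-only audit log of serialised consent records. This is a minimal sketch under assumed field names; the key properties are a timestamp, the notice version the user actually saw, and, for minors, the verified guardian who consented.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class ConsentRecord:
    user_id: str
    purposes: List[str]          # each purpose consented to, matching the notice
    notice_version: str          # which privacy notice version the user saw
    guardian_id: Optional[str]   # set for minors: the verified guardian who consented
    captured_at: str             # ISO-8601 UTC timestamp

def record_consent(user_id, purposes, notice_version, guardian_id=None):
    """Serialise one consent instance for an append-only audit log.

    Consent history should never be overwritten — withdrawal is a new
    record, not a deletion of the old one.
    """
    rec = ConsentRecord(
        user_id=user_id,
        purposes=purposes,
        notice_version=notice_version,
        guardian_id=guardian_id,
        captured_at=datetime.now(timezone.utc).isoformat(),
    )
    return json.dumps(asdict(rec))
```

Binding consent to a notice version matters later: if the notice changes materially (say, to add AI training as a purpose), old records show exactly which users consented to which version.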
Phase 3 — Data Minimisation & Retention Policy (Weeks
6–10)
For every data category identified in Phase 1, define: is
this collection truly necessary? If yes, how long must it be retained? Build
automated deletion workflows for data that has passed its retention period.
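An automated deletion workflow can start as a scheduled sweep that compares each record's last activity against a purpose-specific retention window. The windows below are illustrative numbers, not legal advice — and note that some purposes (e.g. payment records under tax law) may legally require longer retention, which overrides an erasure request.

```python
from datetime import datetime, timedelta
from typing import Dict, List

# Retention windows per declared purpose — illustrative values only
RETENTION: Dict[str, timedelta] = {
    "progress_tracking": timedelta(days=365),
    "inactive_account": timedelta(days=730),
    "payment_record": timedelta(days=365 * 8),  # tax law may mandate long retention
}

def expired_record_ids(records: List[dict], now: datetime) -> List[int]:
    """Ids of records whose purpose-specific retention window has passed.

    Each record is assumed to carry "id", "purpose", and "last_active_at".
    The caller then routes these ids to the actual deletion pipeline.
    """
    out = []
    for rec in records:
        cutoff = now - RETENTION[rec["purpose"]]
        if rec["last_active_at"] < cutoff:
            out.append(rec["id"])
    return out
```

Separating "find expired" from "delete" keeps the sweep auditable: you can log what was deleted, when, and under which retention rule.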
Phase 4 — Security and Breach Response (Weeks 8–12)
Implement or upgrade technical safeguards. Build a breach
detection and response protocol that includes notifying the DPBI and affected
users within the prescribed timeframe. Test your incident response plan.
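A breach response protocol benefits from tracking notification deadlines explicitly. This sketch is deliberately parameterised on the notification window, since the exact timeframe is prescribed by the Rules; the field names are illustrative.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional

@dataclass
class BreachIncident:
    detected_at: datetime
    board_notified_at: Optional[datetime] = None   # when the DPBI was informed
    users_notified_at: Optional[datetime] = None   # when affected users were informed

    def deadline(self, window_hours: int) -> datetime:
        """Notification deadline; pass the window the notified Rules prescribe."""
        return self.detected_at + timedelta(hours=window_hours)

    def pending_notifications(self) -> List[str]:
        """Who still needs to be told — the incident commander's checklist."""
        pending = []
        if self.board_notified_at is None:
            pending.append("DPBI")
        if self.users_notified_at is None:
            pending.append("affected_users")
        return pending
```

Testing the incident response plan means running this path on a drill incident, not discovering during a real breach that nobody owns the notification step.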
Phase 5 — Rights Fulfilment Infrastructure (Weeks 10–14)
Build the operational processes to handle data subject
requests — access, correction, erasure, grievance. Train your team. Define
SLAs. Set up a Data Protection Officer (DPO) function if you qualify as a
Significant Data Fiduciary.
Phase 6 — Ongoing Compliance Culture (Ongoing)
Compliance is not a destination. Conduct regular data
protection impact assessments for new features. Review third-party agreements.
Keep up with DPBI guidance and Rule notifications. Build data privacy into your
product development lifecycle from day one.
Third-party and cross-border considerations
The DPDP Act also has implications for cross-border data
transfers. The Central Government will specify countries to which personal data
may or may not be transferred. For EdTech platforms using global infrastructure
providers (AWS, Google Cloud, Azure), or international analytics and marketing
tools, this requires a detailed review of data flows and contractual
arrangements with vendors.
Furthermore, if you share data with third-party partners —
think affiliate marketing platforms, research organisations, or edtech service
providers — those agreements need to be updated to reflect DPDP obligations.
Your data processors act under your instructions, and their compliance failures
can still be your liability.
8. Turning Compliance Into a Competitive Advantage
There's a tendency to view data protection legislation
purely as a cost centre. That framing is, frankly, shortsighted — particularly
in EdTech.
Consider the trust dynamics at play. Parents choosing an
EdTech platform for their child are making a decision that involves both their
child's academic future and their digital safety. Platforms that can
demonstrate genuine, robust data privacy practices — not just a checkbox
compliance exercise — are building something that their competitors may not
prioritise: institutional trust.
| Compliance Action | Business Benefit |
| --- | --- |
| Transparent, plain-language privacy notice | Increases sign-up conversion — users trust what they understand |
| Easy consent withdrawal | Reduces churn anxiety — users feel in control, not trapped |
| Robust parental consent flows | Unlocks institutional and school sales (procurement teams increasingly require this) |
| Data minimisation practices | Reduces storage costs; reduces breach exposure surface |
| DPO and governance function | Signals maturity to investors, acquirers, and enterprise clients |
| DPDP compliance certification (when available) | Differentiator in a crowded market — particularly for B2B EdTech |
There's also a product angle worth considering. The data minimisation
principle, while initially appearing to constrain product teams, actually
forces a more intentional approach to what data truly drives learning outcomes.
Platforms that can improve student results with less data — because they're
using it more intelligently — are building a more defensible product than those
that simply hoover up everything available.
The bottom line: The EdTech platforms that will
win in the next decade are not the ones that collected the most data. They're
the ones that earned the most trust. DPDP compliance is your framework for
doing exactly that.
Frequently Asked Questions
1. Does the DPDP Act apply to small EdTech startups, not
just large platforms?
Yes. The Act applies to all Data Fiduciaries processing
personal data of individuals in India, regardless of company size. However, the
obligations for "Significant Data Fiduciaries" — those handling large
volumes of sensitive data — are more extensive. The government will notify
which entities qualify. Even so, the core obligations around consent, notice,
and data rights apply to all EdTech platforms from day one of processing
personal data.
2. What if our platform serves both adults and children —
do we need separate systems?
Effectively, yes. Platforms serving mixed-age user bases
need to implement robust age verification at onboarding, and then apply
different data processing rules depending on whether the user is under 18. This
typically means different consent flows, different data retention policies, and
a prohibition on behavioural profiling or targeted advertising for the minor
segment. Many platforms are redesigning their architecture to support this
age-based differentiation.
3. How do we handle data collected before the DPDP Act
came into force?
This is one of the most practically complex questions in
DPDP compliance. Historical data collected without DPDP-compliant consent will
likely need to be either re-consented (i.e., you go back to users and obtain
fresh, compliant consent) or deleted. The Rules may provide transition
timelines and specific guidance on this. Proactively auditing and categorising
your historical data now — before enforcement timelines crystallise — is
strongly advisable.
4. Is there a mandatory Data Protection Officer (DPO)
requirement for all EdTech companies?
The mandatory DPO requirement applies specifically to
"Significant Data Fiduciaries" as designated by the Central
Government. However, even for platforms not formally designated as Significant,
having a named privacy contact and a clear internal governance structure is
considered best practice and will be expected by enterprise clients and
institutional buyers.
5. Can EdTech platforms continue using student data for
AI model training?
This is an emerging grey area. Using student data to train
AI models — including adaptive learning algorithms — must be covered by the
original consent notice. If your privacy notice did not explicitly mention this
use case, you cannot use historical data for it without fresh consent. For new
data, this purpose must be explicitly disclosed and consented to. For
children's data, the prohibition on profiling creates additional constraints on
how AI-generated insights about individual students can be used.
The DPDP Act is not the end of data-driven EdTech. It's the
end of careless data-driven EdTech. Platforms that have been
thoughtful about data — collecting what they need, using it to genuinely serve
students, and being transparent with users about how it's used — will find
compliance relatively straightforward. Platforms that have treated data as a
free resource to be hoarded and monetised however possible face a significant
reckoning.
The good news is that the direction of travel is clear, and
the EdTech sector has both the technical capability and the genuine motivation
— in the form of serving students well — to get this right. Privacy-respecting
EdTech isn't just legally required. It's what parents, students, and educators
deserve.
Start the audit. Redesign the consent flows.
Build the deletion pipelines. Appoint your privacy lead. The platforms that do
this now — before enforcement pressure forces their hand — will emerge as the
trusted names in a sector where trust is everything.
