India's EdTech sector serves over 40 million learners, and a significant share of them are school-age students. If your platform collects names, grades, learning progress, quiz scores, or even device identifiers from anyone under 18, the DPDP Act 2023 has a dedicated section written with you in mind. It is called Section 9, and it carries the highest penalty ceiling in the entire Act: ₹200 crore per violation.
That is not a typo, and it is not the kind of compliance topic you can push to "next quarter." Here is what Indian EdTech companies need to understand, what they need to change, and where the industry is getting it wrong. For a cross-sector comparison of how these obligations differ from other industries, see our DPDP compliance by industry guide.
Key Takeaways
- Under the DPDP Act 2023, every individual below 18 is classified as a "child." EdTech platforms processing student data must obtain verifiable parental consent before any data collection begins (Section 9(1)).
- Section 9(3) imposes an absolute ban on tracking, behavioral monitoring, and targeted advertising directed at children. This applies regardless of whether parental consent has been obtained.
- Rule 10 of the DPDP Rules 2025 specifies acceptable methods for verifiable parental consent: DigiLocker-based tokens, government-issued ID verification, and OTP-based consent via a parent's registered mobile number.
- The penalty for children's data violations under the DPDP Act can reach ₹200 crore, the highest tier in the penalty schedule.
- EdTech platforms that operate as Data Processors for schools may rely on the school's parental consent, but only if the school obtained that consent in compliance with the Act.
Why Does DPDP Hit EdTech Differently Than Other Sectors?
Most industries processing personal data in India will need to get consent mechanisms right, build Data Principal rights workflows, and maintain compliant records. EdTech inherits all of those obligations, plus an entirely separate layer of restrictions that only apply when the Data Principal is a child.
The reason is structural. EdTech's core users are, by the nature of the product, minors. A B2B SaaS company might incidentally process a few birth dates from HR records; a K-12 learning platform collects detailed behavioral data, learning patterns, assessment results, and sometimes biometric attendance records from students who are 8, 12, or 16 years old. That is a fundamentally different risk profile, and the DPDP Act treats it accordingly.
Three provisions make the EdTech compliance challenge distinct:
- Verifiable parental consent is mandatory before any processing begins (Section 9(1)). Not a checkbox. Not an "I am above 18" button. Verifiable consent from a parent or lawful guardian, confirmed through a method prescribed by the Rules.
- Certain processing activities are absolutely prohibited (Section 9(3)). Tracking, behavioral monitoring, and targeted advertising directed at children cannot be performed even with parental consent. This is a hard ban, not a consent-gated activity.
- The penalty ceiling is higher than for adult data breaches. The Schedule to the DPDP Act prescribes up to ₹200 crore for violations involving children's data, compared to ₹50 crore for general consent violations.
So what does this mean for your EdTech platform's onboarding flow, your analytics stack, and your marketing operations? Let's walk through each.
What Does Section 9 of the DPDP Act Actually Require?
Section 9 is the children's data provision of the DPDP Act 2023. It is short, just five sub-sections, but its implications run deep for any platform where minors are the primary users.
Section 9(1): Verifiable Parental Consent
The Act mandates that a Data Fiduciary must obtain "verifiable consent of the parent or lawful guardian" before processing any personal data of a child. The word "verifiable" is doing heavy lifting here. A self-declaration checkbox that says "My parent agrees" does not qualify.
Rule 10 of the DPDP Rules 2025 prescribes several acceptable methods:
| Consent Method | How It Works | Practical Consideration |
|---|---|---|
| DigiLocker-based verification | Parent authenticates via DigiLocker, issuing a token that verifies identity | Highest assurance; requires parent to have DigiLocker account |
| Government-issued ID verification | Parent submits government ID (Aadhaar, PAN, Voter ID) for verification | Wide accessibility; requires secure ID verification infrastructure |
| OTP-based consent via registered mobile | OTP sent to parent's registered mobile number to confirm consent | Simplest to implement; lower assurance than document-based methods |
| Consent manager mechanism | Parent uses a registered consent manager under the DPDP framework | Consent managers must be registered with the Data Protection Board; operational timeline is November 2026 |
The practical question every EdTech founder I have spoken to asks: "Does this mean a parent has to verify their identity before my platform can even show their child a demo lesson?" In most cases, yes. Section 9(1) does not include a "try before you consent" exception. If the demo lesson requires the student to create an account, enter their name, or generate any personal data, the parental consent obligation is triggered.
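Because Section 9(1) has no "try before you consent" exception, the registration flow itself has to branch on age before any personal data is processed. The sketch below illustrates one way to express that gate; the function names, the routing labels, and the flow shape are illustrative assumptions, not anything prescribed by the Act or the Rules.

```python
from datetime import date

# Illustrative Section 9(1)-style registration gate: any user below 18 is
# routed to a parental-consent step before an account is created.
CHILD_AGE_THRESHOLD = 18  # DPDP Act 2023, Section 2(f): below 18 is a "child"

def age_on(dob: date, today: date) -> int:
    """Completed years between dob and today."""
    years = today.year - dob.year
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1
    return years

def registration_next_step(dob: date, parental_consent_verified: bool,
                           today: date) -> str:
    """Decide what the onboarding flow should do next."""
    if age_on(dob, today) >= CHILD_AGE_THRESHOLD:
        return "create_account"          # adult: direct consent flow
    if parental_consent_verified:
        return "create_account"          # child with verified parental consent
    return "request_parental_consent"    # block until a parent verifies

# A 15-year-old without verified consent is held at the consent step:
print(registration_next_step(date(2010, 5, 1), False, date(2026, 2, 1)))
```

The key design point is that the consent check sits before account creation, not after: no student record exists until the gate returns "create_account".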
Section 9(2): The "Detrimental Effect" Standard
Section 9(2) states that no Data Fiduciary shall process children's data in a manner "likely to cause any detrimental effect on the well-being of a child." This is a harm-based standard, and it applies even when valid parental consent exists.
For EdTech, this provision creates a question around certain analytics practices. If your adaptive learning algorithm determines that a student is "struggling" and serves them progressively easier content to keep engagement metrics high, is that a "detrimental effect on well-being"? The Act does not define the term precisely. As of February 2026, the Data Protection Board has not issued guidance on what constitutes a detrimental effect in an educational context.
What is clear: gamification mechanics designed to maximise screen time, difficulty curves tuned to create frustration loops, and engagement tactics borrowed from consumer gaming apps will face scrutiny under this provision.
Section 9(3): The Absolute Prohibitions
This is the sub-section that will force the most significant product changes across Indian EdTech. Section 9(3) of the DPDP Act 2023 prohibits three activities in relation to children's data:
- Tracking of children across websites or apps
- Behavioral monitoring of children
- Targeted advertising directed at children
These prohibitions are absolute. They cannot be overridden by parental consent. A parent cannot consent to having their child's browsing behavior tracked for advertising purposes, because the Act bars the Data Fiduciary from performing that activity at all.
For EdTech specifically, the behavioral monitoring prohibition collides directly with learning analytics, personalized learning paths, and adaptive assessment engines. The critical question: where does "educational analytics" end and "behavioral monitoring" begin?
Industry interpretation, supported by the limited expert commentary available as of February 2026, suggests that educational analytics used solely to improve learning outcomes and delivered as a feature of the educational service likely fall within the educational purpose for which consent was obtained. But analytics data repurposed for advertising, sold to third parties, or used to build behavioral profiles beyond the educational context crosses the line.
Here is a practical framework for evaluating your analytics:
| Analytics Practice | Likely Compliant | Likely Non-Compliant |
|---|---|---|
| Tracking quiz scores to adapt difficulty | ✅ Educational purpose | |
| Measuring time-on-task for teacher dashboards | ✅ Educational purpose | |
| Building student profiles for content recommendations within the learning platform | ✅ If limited to educational purpose | ⚠️ If profiles are shared externally or used for marketing |
| Tracking student navigation patterns across your website for ad targeting | | ❌ Prohibited under Section 9(3) |
| Sharing learning behavior data with third-party ad networks | | ❌ Prohibited under Section 9(3) |
| Using engagement data to serve personalised product upsells to the child | | ❌ Likely constitutes targeted advertising directed at a child |
Note that the distinction is purpose, not technology. The same event-tracking code can serve compliant educational analytics or non-compliant behavioral monitoring. What determines compliance is what you do with the data and who you share it with.
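Since the distinction is purpose rather than technology, one practical pattern is to tag every analytics event with a declared purpose and refuse prohibited purposes for child users at the point of emission. This is a minimal sketch of that idea; the purpose labels and function shape are illustrative assumptions, not a statutory taxonomy.

```python
# Purposes that Section 9(3) bars outright for children, regardless of consent.
PROHIBITED_PURPOSES_FOR_CHILDREN = {
    "advertising",            # targeted advertising directed at a child
    "cross_site_tracking",    # tracking across websites or apps
    "behavioral_profiling",   # behavioral monitoring of children
}

def track_event(user_is_child: bool, event: str, purpose: str,
                sink: list) -> bool:
    """Record an analytics event only if the purpose is lawful for this user."""
    if user_is_child and purpose in PROHIBITED_PURPOSES_FOR_CHILDREN:
        return False  # drop the event: no consent can authorise it
    sink.append({"event": event, "purpose": purpose})
    return True

events: list = []
track_event(True, "quiz_completed", "adaptive_difficulty", events)  # kept
track_event(True, "page_view", "advertising", events)               # dropped
print(len(events))  # only the educational event survives
```

The same `track_event` call serves both compliant and non-compliant use; what changes the outcome is the declared purpose, mirroring the table above.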
Section 9(4) and 9(5): Potential Exemptions
The Act includes two exemption provisions for children's data:
Section 9(4) allows the Central Government to exempt certain classes of Data Fiduciaries or certain purposes from the requirements of verifiable parental consent and the absolute prohibitions under Section 9(3). As of February 2026, no such exemptions have been notified.
Section 9(5) allows exemption for a Data Fiduciary if the Central Government is satisfied that its processing of children's data is "verifiably safe." This could theoretically provide a pathway for established EdTech platforms to demonstrate safety and receive some relaxation, but the mechanism for applying or qualifying has not been published.
The prudent approach for any EdTech company: build for full Section 9 compliance. If exemptions arrive, you can relax your controls. If they don't, you are already compliant.
How Does This Apply to Different EdTech Business Models?
Not every EdTech company operates the same way, and the compliance burden varies depending on your specific model. I have spoken with founders across the spectrum, from K-12 learning platforms to exam prep companies to school management system vendors. The compliance picture looks different for each.
Direct-to-Student Platforms (B2C EdTech)
Companies like test prep platforms, coding bootcamps for kids, or K-12 supplement apps that acquire students directly are squarely in the crosshairs. You are the Data Fiduciary. You collect the data. You determine the purpose of processing. Section 9 applies in full.
Your compliance checklist:
- Implement a verifiable parental consent flow before account creation
- Build an age-gating mechanism at onboarding to identify users under 18
- Strip all behavioral tracking, ad targeting, and third-party analytics SDKs from student-facing experiences
- Maintain auditable consent records linking each student account to verified parental consent
- Provide parents with a dashboard or mechanism to withdraw consent at any time
The onboarding redesign is the hardest part. If your current signup flow is "enter name, enter email, start learning," you need to insert an age verification step, a parental consent verification step, and a waiting period while consent is confirmed, all before the student touches any content that generates personal data.
School-Facing Platforms (B2B EdTech)
If your platform is deployed through schools (learning management systems, attendance trackers, student information systems), the compliance picture is slightly different but no less demanding.
When a school contracts an EdTech platform, two scenarios exist:
Scenario A: The school is the Data Fiduciary, and you are the Data Processor. If the school determines the purpose and means of processing, and your platform processes data on the school's instructions, you may rely on the school's parental consent. But "rely on" requires verification. You need contractual assurance from the school that compliant consent was obtained, and you need to specify this in your data processing agreement. If the school's consent was deficient, enforcement may reach both of you.
Scenario B: You collect data independently from students. If your platform allows students to log in independently, creates individual student profiles, or processes data beyond what the school's contract covers, you become an independent Data Fiduciary for that processing. Section 9 applies directly to you.
Most B2B EdTech platforms operate in a hybrid model where some processing is clearly on the school's behalf and some is clearly independent (product analytics, marketing to parents, freemium conversions). Drawing the line between Scenario A and Scenario B is essential, and most companies I speak with have not done it yet.
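One way to start drawing that Scenario A / Scenario B line is a processing register that records, per activity, who determines the purpose, and derives the platform's role from that. The sketch below is a hypothetical illustration; the field names and example activities are assumptions for demonstration.

```python
# Derive the platform's DPDP role per processing activity from who sets
# the purpose: school-instructed processing -> processor (Scenario A),
# platform-determined processing -> independent fiduciary (Scenario B).

def role_for_activity(purpose_determined_by: str) -> str:
    if purpose_determined_by == "school":
        return "data_processor"   # act on the school's documented instructions
    return "data_fiduciary"       # platform needs its own Section 9 basis

register = [
    {"activity": "grade sync to school LMS", "purpose_determined_by": "school"},
    {"activity": "freemium upsell emails to parents", "purpose_determined_by": "platform"},
    {"activity": "product analytics on student usage", "purpose_determined_by": "platform"},
]

for entry in register:
    entry["role"] = role_for_activity(entry["purpose_determined_by"])

# Activities where the platform sets the purpose need independent compliance:
independent = [e["activity"] for e in register if e["role"] == "data_fiduciary"]
print(independent)
```

Even this toy register makes the hybrid model visible: the moment any activity is tagged as platform-determined, Section 9 applies to that activity directly.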
Content Platforms and Marketplaces
EdTech aggregators, tutoring marketplaces, and educational content platforms face a specific challenge: they often claim they don't "collect" student data because the student interacts with an independent tutor or content creator on the platform.
This argument is unlikely to hold under the DPDP Act. If your platform facilitates the collection of personal data, processes it in your systems (even transiently), and determines any aspect of the purpose (matching students to tutors, recommending content, maintaining progress records), you are a Data Fiduciary for that processing.
What Is the Age Threshold, and Why Does It Matter?
The DPDP Act 2023 defines a "child" as any individual who has not completed 18 years of age (Section 2(f)). This is notably higher than many international standards. GDPR sets the threshold at 16 (with member states able to lower it to 13), and COPPA in the United States applies only to children under 13.
India's 18-year threshold means that your typical Class 12 student preparing for JEE or NEET, who is 17 years old, independent enough to choose their own test prep platform, and possibly paying with their own UPI account, is legally a child under the DPDP Act. Their parent or guardian must provide verifiable consent before your platform processes their data.
This has significant implications for exam prep companies, college admission platforms, and skill development programs aimed at high-school students. Your user might look, behave, and transact like an adult, but the law treats them as a child until they turn 18.
| Jurisdiction | Age Threshold for "Child" | Practical Impact on EdTech |
|---|---|---|
| India (DPDP Act 2023) | Below 18 years | All K-12 + most competitive exam prep users require parental consent |
| EU (GDPR) | Below 16 (can be lowered to 13 by member states) | Primarily impacts primary/lower secondary EdTech |
| USA (COPPA) | Below 13 | Only impacts elementary-age EdTech; teen platforms largely unaffected |
| UK (Children's Code) | Below 18 (design code), below 13 (parental consent) | Design obligations broader; consent obligations narrower |
For Indian EdTech founders, this means your compliance surface is much broader than the one your US or EU competitors face. Every user from age 5 to 17 triggers the full weight of Section 9.
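The jurisdictional comparison above can be reduced to a simple threshold check. This sketch mirrors the table, with the simplifying assumption that GDPR uses its default of 16 (ignoring member-state variations); the regime labels are illustrative.

```python
# Age below which a user triggers child-data obligations, per the table above.
CHILD_THRESHOLDS = {
    "india_dpdp": 18,   # DPDP Act 2023, Section 2(f)
    "eu_gdpr": 16,      # default; member states may lower to 13
    "usa_coppa": 13,
}

def is_child(age: int, regime: str) -> bool:
    return age < CHILD_THRESHOLDS[regime]

# A 17-year-old JEE aspirant is a child under the DPDP Act alone:
print({regime: is_child(17, regime) for regime in CHILD_THRESHOLDS})
# india_dpdp is True; eu_gdpr and usa_coppa are False
```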
What Should Your Compliance Roadmap Look Like?
If you are running an EdTech platform with users under 18, here is a phased approach that I have seen work for companies of varying sizes.
Phase 1: Data Mapping and Audit (Weeks 1-3)
Before changing any product flows, you need to understand what you are actually collecting. Map every data point your platform captures from student users:
- Account registration data (name, age, email, phone, school, grade)
- Learning activity data (quiz scores, completion rates, time-on-task, content viewed)
- Device and technical data (IP address, device ID, browser fingerprint, location)
- Communication data (chat messages with tutors, discussion forums, support tickets)
- Payment data (if students or parents transact on the platform)
- Third-party SDK data (analytics, crash reporting, advertising, social login)
For each data point, document the purpose and whether it is essential to the educational service. Data points without a clear educational purpose, particularly anything feeding advertising or engagement optimization, should be flagged for removal or redesign.
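The Phase 1 flagging rule can be expressed mechanically: any mapped data point whose documented purpose is not educational gets flagged for removal or redesign. The purpose categories and field names below are illustrative examples, not a prescribed taxonomy.

```python
# Purposes treated as part of the educational service in this sketch.
EDUCATIONAL_PURPOSES = {"instruction", "assessment", "teacher_reporting", "account_admin"}

# A miniature data map: each captured field with its documented purpose.
data_map = [
    {"field": "quiz_scores", "purpose": "assessment"},
    {"field": "time_on_task", "purpose": "teacher_reporting"},
    {"field": "device_fingerprint", "purpose": "ad_attribution"},
    {"field": "chat_messages", "purpose": "instruction"},
]

# Flag anything without a clear educational purpose for removal or redesign.
flagged = [d["field"] for d in data_map if d["purpose"] not in EDUCATIONAL_PURPOSES]
print("flag for removal/redesign:", flagged)
```

In a real audit the data map would come from your schema and SDK inventory, but the decision rule stays the same: no documented educational purpose, no collection.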
Phase 2: Consent Architecture Redesign (Weeks 3-6)
Build the verifiable parental consent flow. At minimum, this requires:
- Age gate at registration: Ask the user's date of birth or age range at the point of account creation. If the user indicates they are below 18, route them through the parental consent workflow.
- Parent identification: Collect the parent's identity through one of the Rule 10-approved methods (DigiLocker, government ID, or OTP-based verification).
- Consent recording: Log the parent's identity, the timestamp of consent, the specific purposes consented to, and the method of verification. This is your audit trail.
- Consent withdrawal mechanism: Parents must be able to withdraw consent at any time, and withdrawal must trigger deletion of the child's data (subject to any legitimate retention requirements). Ensure withdrawal is as easy as granting consent in the first place.
- Periodic re-verification: While the Act does not mandate periodic re-consent, best practice suggests re-verifying consent at least annually or when processing purposes change materially.
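The consent-recording and withdrawal steps above imply a specific record shape: parent identity, timestamp, purposes, verification method, and a withdrawal path that triggers deletion. This is a hypothetical sketch of that audit trail; all field and function names are illustrative assumptions.

```python
from datetime import datetime, timezone

def record_consent(store: dict, child_id: str, parent_id: str,
                   method: str, purposes: list) -> None:
    """Log a verified parental consent: who, when, for what, and how verified."""
    store[child_id] = {
        "parent_id": parent_id,
        "method": method,            # e.g. "digilocker", "govt_id", "otp"
        "purposes": list(purposes),  # the specific purposes consented to
        "granted_at": datetime.now(timezone.utc).isoformat(),
        "withdrawn_at": None,
    }

def withdraw_consent(store: dict, child_id: str) -> str:
    """Mark consent withdrawn and signal deletion of the child's data."""
    record = store[child_id]
    record["withdrawn_at"] = datetime.now(timezone.utc).isoformat()
    return "schedule_deletion"  # erasure, subject to lawful retention requirements

consents: dict = {}
record_consent(consents, "student-42", "parent-7", "otp",
               ["instruction", "assessment"])
print(withdraw_consent(consents, "student-42"))  # "schedule_deletion"
```

Keeping the withdrawal timestamp alongside the grant record preserves the full consent lifecycle for audit, rather than deleting the evidence along with the data.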
Phase 3: Product and Analytics Clean-Up (Weeks 4-8)
Audit every analytics tool, SDK, and third-party integration in your product for Section 9(3) compliance:
- Remove or reconfigure any ad-tech SDKs that track student behavior
- Ensure Google Analytics, Mixpanel, or equivalent tools are configured to exclude personally identifiable student data, or replace them with privacy-preserving alternatives
- Review recommendation algorithms to ensure they operate within the educational purpose, not for engagement maximisation beyond learning goals
- Disable any cross-site tracking pixels on student-facing pages
- Review data sharing agreements with tutors, content partners, and vendors
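One concrete clean-up tactic for the analytics items above is to scrub directly identifying fields from student events before they reach any third-party tool. The field list below is an illustrative starting point, not an exhaustive or authoritative one.

```python
# Directly identifying fields to strip from student-facing analytics events.
PII_FIELDS = {"name", "email", "phone", "ip_address", "device_id", "location"}

def scrub_event(event: dict) -> dict:
    """Return a copy of the event with identifying fields removed."""
    return {key: value for key, value in event.items() if key not in PII_FIELDS}

raw = {"event": "lesson_completed", "lesson_id": "algebra-3",
       "device_id": "abc-123", "ip_address": "203.0.113.5"}
print(scrub_event(raw))  # only the non-identifying fields remain
```

A scrubber like this sits between your product and any analytics SDK, so the default for child users is that identifying data never leaves your systems.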
Phase 4: Documentation and Training (Weeks 6-10)
DPDP compliance is not just a product problem; it is an organizational one. Ensure:
- Your privacy policy explicitly addresses children's data and clearly describes the parental consent process
- Your data processing agreements with school clients include clear language about consent responsibilities
- Customer support and sales teams understand the parental consent requirements and can explain them to school administrators and parents
- Incident response procedures account for children's data, given the elevated penalty ceiling
For a broader view of regulatory timelines and enforcement deadlines, the DPDP compliance checklist covers the full set of obligations across all business types.
What Happens If You Get This Wrong?
The Schedule to the DPDP Act 2023 prescribes specific penalty tiers. For violations involving children's data, the penalties are the highest in the Act:
| Violation Category | Maximum Penalty |
|---|---|
| Processing children's data in breach of Section 9 | Up to ₹200 crore |
| Failure to implement reasonable security safeguards (resulting in breach) | Up to ₹250 crore |
| General failure to fulfil Data Fiduciary obligations | Up to ₹50 crore |
| Failure to notify the Data Protection Board of a data breach | Up to ₹200 crore |
Note that the ₹200 crore penalty for children's data violations is a per-violation figure. If a platform is found to have systematically processed children's data without valid parental consent across, say, 100,000 student accounts, the theoretical exposure is significant enough to constitute an existential threat to most EdTech companies.
Beyond penalties, consider the reputational damage. In a sector where parents are the paying customers and trust is the primary purchase driver, a publicised enforcement action involving children's data would be devastating. Schools that have deployed your platform would face their own accountability questions. The compounding effects go well beyond the fine.
The Competitive Advantage Nobody Is Talking About
Here is the part that gets lost in the compliance conversation: the EdTech companies that build genuinely strong children's data protection will have a structural advantage once enforcement begins.
Schools are already starting to ask about DPDP compliance in their procurement questionnaires. Parents, especially in metro markets, are increasingly aware that their children's learning platforms collect extensive data. A platform that can demonstrate verified parental consent, transparent data practices, and zero behavioral tracking is a platform that wins trust faster.
I have spoken with school procurement officers who told me they are planning to make DPDP compliance a mandatory vendor qualification criterion by the 2027 academic year. The platforms that are ready will win those contracts. The ones that are scrambling will lose them.
The compliance investment you make now is not just risk mitigation. It is a market positioning play that separates you from competitors who are still hoping the rules will somehow not apply to them.
Frequently Asked Questions
Does the DPDP Act apply to EdTech platforms that only operate within a single school?
Yes. The DPDP Act 2023 applies to any entity processing digital personal data within India, regardless of scale. A learning management system deployed in a single school with 500 students is subject to the same Section 9 obligations as a platform serving millions. The only variable is whether the school or the EdTech platform is the Data Fiduciary for a given processing activity.
Can EdTech companies use adaptive learning algorithms under the DPDP Act?
Adaptive learning algorithms that process student data for educational improvement are likely permissible, provided the data is used solely within the educational purpose for which parental consent was obtained. However, the same data used for behavioral profiling, engagement maximisation beyond learning goals, or third-party sharing would likely violate Section 9(2) and Section 9(3) of the DPDP Act 2023.
What if a student turns 18 while using the platform? Does parental consent still apply?
Once a user turns 18, they are no longer classified as a child under Section 2(f) of the DPDP Act 2023. At that point, the user becomes a Data Principal in their own right. Best practice is to obtain fresh consent directly from the user (now an adult) and transition their account from parental-consent-based processing to direct consent. The parent's consent does not automatically convert to the adult user's consent.
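The age-out transition described above can be checked on each login or on a scheduled job: once the user has completed 18 years, the platform should stop relying on parental consent and seek direct consent. This is a minimal sketch; the function and label names are illustrative assumptions.

```python
from datetime import date

def consent_basis(dob: date, today: date) -> str:
    """Which consent basis currently applies to this user."""
    years = today.year - dob.year
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1
    if years < 18:
        return "parental_consent"
    # The parent's consent does not carry over past the 18th birthday:
    return "obtain_direct_consent"

print(consent_basis(date(2008, 1, 15), date(2026, 2, 1)))  # turned 18 already
```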
How should EdTech platforms handle data from children if they also serve adult learners?
Platforms serving both adult and minor users must implement differentiated data processing. This means age-gating at registration, separate consent flows for users identified as children, and distinct data handling policies for children's data (no tracking, no behavioral monitoring, no targeted advertising). Many platforms find it operationally simpler to apply the stricter children's data standards across all users, avoiding the risk of misclassification.
Are there any exemptions for educational institutions under Section 9?
As of February 2026, no exemptions under Section 9(4) have been notified by the Central Government. Section 9(5) provides a potential pathway for exemption if processing is "verifiably safe," but the criteria and application process have not been published. Educational institutions should plan for full compliance with Section 9 until formal exemptions are issued.
Start Building Your EdTech Compliance Stack
This article is for informational purposes and reflects the DPDP Act 2023 and DPDP Rules 2025 as understood at the time of writing. For guidance specific to your business, we recommend consulting a qualified data protection professional.
If your EdTech platform serves users under 18, the compliance clock is already running. ComplyZero's self-serve platform helps EdTech companies implement parental consent flows, maintain audit-ready records, and manage student data rights, all designed for the specific requirements of Section 9. Setup takes 15 minutes, not 15 weeks.
Simplify Your DPDP Compliance
ComplyZero handles the complexity for you: consent management, privacy notices in 22 languages, DSR workflows, and audit-ready compliance records. Get your business DPDP-ready in minutes, not months.
Get Started Free