PRACTITIONER FAQ

EdTech + Schools: Processing Children's Data Under §9 DPDP and the Parental Consent Architecture

10 min read | EdTech DPO · School IT Administrator · Education Compliance · Product Manager | April 2026

PRACTITIONER FAQ · EPISODE 05 OF 10 | EdTech · Schools · Minor Data Subjects

Students aged 6–18. Learning analytics, AI risk scoring, product improvement training data. The admission form clause won't hold. Here's what the DPDP §9 parental consent framework actually requires.

📚
THE SCENARIO

An EdTech company deploys an LMS to 1,200 schools. Students aged 6–18 use the platform. The EdTech collects learning analytics, quiz scores, and behavioural inference (predicted learning style, risk-of-dropout score). It uses anonymised student data to improve its AI recommendation engine. The school's annual admission form included a clause about "educational technology services."

Q 5.1

Who must obtain parental consent — the school or the EdTech?

PARENTAL CONSENT RESPONSIBILITY — §9 DPDP ACT

School (Primary Fiduciary) — obtains parental consent for:
✓ Enrollment on the EdTech platform
✓ Learning activity data

EdTech (Co-Fiduciary, where it determines its own purposes) — needs its own consent for:
✗ AI product training
✗ Its own analytics products

§9 Prohibitions:
• Behavioural monitoring
• Targeted advertising
• Individual tracking

Admission Form Clause = INSUFFICIENT Parental Consent
A generic clause ("we use educational technology") is NOT specific, NOT informed, and does NOT name the EdTech or describe data use.
Required: the named EdTech, specific data categories, purpose, and retention period — with affirmative parental consent at each academic year.
For minors: k-anonymity at a minimum of k=50 for any training datasets; no individual linkage in AI training pipelines.

The school is the primary Fiduciary. But when the EdTech independently determines how to use analytics data to improve its own product, it becomes a co-Fiduciary for that purpose — with its own §9 parental consent obligation. The admission form clause does not satisfy this requirement.
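A consent record that survives scrutiny has to carry every element the answer above names. A minimal sketch of what such a record could look like — the field names and validation rules are our illustrative assumptions, not prescribed by the Act:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Illustrative only: field names and checks are assumptions about what a
# §9-compliant parental consent record should capture per academic year.

@dataclass
class ParentalConsentRecord:
    student_id: str
    parent_id: str
    edtech_name: str                  # the named EdTech, not a generic clause
    data_categories: list             # e.g. ["quiz scores", "learning activity"]
    purpose: str
    retention_period_months: int
    academic_year: str                # consent re-obtained each academic year
    affirmative_action_at: Optional[date] = None  # explicit parental opt-in

    def is_valid(self) -> bool:
        """A generic admission-form clause fails every one of these checks."""
        return all([
            self.edtech_name.strip() != "",
            len(self.data_categories) > 0,
            self.purpose.strip() != "",
            self.retention_period_months > 0,
            self.affirmative_action_at is not None,  # silence is not consent
        ])
```

The last check is the one the admission-form clause always fails: there is no affirmative act by the parent tied to a named fiduciary and purpose.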

Q 5.2

Can the EdTech use anonymised student data to train its AI recommendation engine?

● CONDITIONALLY PERMITTED — HIGH SCRUTINY

§9(3) prohibits tracking and behavioural monitoring of minors. Using individual student learning patterns to train a recommendation model constitutes behavioural profiling. The safest implementation: aggregate at cohort level (e.g. "Grade 7 students with pattern X"), prove k-anonymity at a minimum of k=50, and document the methodology in the ROPA.

Q 5.3

A parent withdraws consent. Must the EdTech delete the student's entire learning history?

EdTech's own analytics/product data: must delete. Academic records (assessment scores, attendance) that form part of the student's academic history: the school has a legitimate basis to retain under educational regulations. The EdTech should export this data to the school's own systems and delete it from its servers, or maintain it in a school-controlled partition where the school is Fiduciary and EdTech is Processor with no independent access.
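The withdrawal flow described above splits one student record along the Fiduciary boundary: academic history is exported to the school first, then everything on the EdTech side is deleted. A hypothetical sketch — the function and store names are placeholders, not a real API:

```python
# Hypothetical sketch of the consent-withdrawal flow. Stores are modelled as
# dicts; the set of retainable academic fields is an assumption.

ACADEMIC_FIELDS = {"assessment_scores", "attendance"}

def handle_consent_withdrawal(student_id, edtech_store, school_store):
    """Export academic records to the school (which retains them as Fiduciary
    under educational regulations), then delete all EdTech-side data."""
    record = edtech_store.get(student_id, {})
    academic = {k: v for k, v in record.items() if k in ACADEMIC_FIELDS}
    school_store.setdefault(student_id, {}).update(academic)  # export first
    edtech_store.pop(student_id, None)  # then delete, including analytics
    return academic
```

Note the ordering: the export must complete before deletion, otherwise the school loses records it has a legitimate basis to retain. Behavioural inferences (e.g. a dropout-risk score) never cross the boundary.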

Build your §9 compliant consent architecture with platform support.

Talk to a specialist →

Get DPDP compliance insights in your inbox

Practical guides for CISOs, DPOs, and compliance teams — no spam, unsubscribe anytime.

Ready to implement what you've read?

The CreativeCyber DPDP Assurance Platform puts every framework, workflow, and control referenced in this article into a single audit-ready platform — built for regulated Data Fiduciaries, including EdTech.

Book a Live Demo →