AI Compliance for Education: FERPA, EU AI Act, and Student Data Requirements
Education is one of the highest-risk sectors under the EU AI Act — AI used in admissions, assessments, and student placement is explicitly classified as high-risk under Annex III. Schools, universities, and EdTech companies must navigate FERPA, COPPA, EU AI Act, and an expanding set of state student privacy laws simultaneously.
Why AI in Education Is High-Stakes for Compliance
Three converging factors make education a compliance priority:
Minors are the primary subjects
Children and young adults have heightened data protection rights under every jurisdiction. COPPA (under-13s), GDPR (special protections for minors), and state laws all impose stricter requirements when AI processes student data.
Consequential decisions with lifelong impact
AI systems in education often make or influence decisions about admissions, grade progression, academic support, and discipline — decisions that affect students' life trajectories. Regulators treat these as high-consequence AI use cases.
Legacy data sharing practices
Education has historically shared data widely with EdTech vendors under broad FERPA exceptions. AI changes this risk calculus: vendors can now infer far more from data than FERPA's drafters anticipated in 1974.
EU AI Act Risk Classification for Educational AI
The EU AI Act Annex III para.3 explicitly lists "AI systems intended to be used for the purpose of determining access to or assigning persons to educational and vocational training institutions, or to evaluate learning outcomes" as high-risk.
AI for student admissions or placement decisions
High-Risk (Annex III)
Why: Access to education and educational benefits; explicitly listed in Annex III para.3(a)
Required: Conformity assessment, technical documentation, human oversight, bias testing
AI that assesses or grades student work
High-Risk (Annex III)
Why: Evaluation of learning outcomes determines educational opportunities
Required: Full high-risk AI requirements including explainability and audit logs
AI that detects student emotional state or engagement
Prohibited or High-Risk
Why: Inferring students' emotions in education institutions is prohibited under Article 5(1)(f) except for medical or safety reasons; engagement monitoring outside that ban can still be high-risk, and it affects minors
Required: Strict data minimisation, parental consent, DPA required
Adaptive learning platforms that personalise curriculum
Limited Risk or High-Risk
Why: May be high-risk if it shapes educational trajectory; limited risk if purely presentational
Required: Risk assessment required to determine applicable tier
AI chatbots for student support (mental health)
High-Risk (contested)
Why: Mental health support may fall under access to essential services, including healthcare, under Annex III para.5
Required: Consult legal counsel; default to high-risk requirements as precaution
AI-generated content tools for student use (writing assistants)
Limited Risk
Why: No consequential individual decision-making
Required: Transparency; disclose AI use to students and staff
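The tiering above can be sketched as a simple lookup. This is an illustrative example only: the use-case keys and tier labels are assumptions made for this sketch, not legal categories, and real classification requires a documented risk assessment and legal counsel.

```python
# Illustrative sketch of the EU AI Act risk tiers listed above.
# Use-case keys and tier strings are assumptions for this example.

RISK_TIERS = {
    "admissions_placement":  "high-risk (Annex III)",
    "assessment_grading":    "high-risk (Annex III)",
    "adaptive_learning":     "limited-risk or high-risk (needs assessment)",
    "mental_health_chatbot": "high-risk (contested; default to high-risk)",
    "writing_assistant":     "limited-risk (transparency duties)",
}

def classify(use_case: str) -> str:
    """Return the risk tier for a known use case, defaulting
    conservatively to a full assessment when unrecognised."""
    return RISK_TIERS.get(use_case, "unknown: run a risk assessment")

print(classify("admissions_placement"))  # high-risk (Annex III)
```

Note the deny-by-default design: anything not explicitly mapped triggers a full risk assessment rather than being assumed low-risk.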
August 2, 2026 deadline
High-risk AI systems must meet full EU AI Act requirements by August 2, 2026. Schools and EdTech companies operating in the EU should begin conformity assessments and technical documentation immediately if they have not already.
FERPA and AI: Five Rules Every Institution Needs to Know
The Family Educational Rights and Privacy Act (FERPA) was written in 1974 — long before AI. The US Department of Education has not issued AI-specific FERPA guidance as of 2026. However, the existing rules apply clearly to most AI use cases in education:
Education records are protected
Any record directly related to a student and maintained by an educational institution is an "education record" covered by FERPA. This includes AI-generated assessments, performance predictions, and behavioural flags.
Third-party AI vendors need written agreement
Schools can share education records with AI vendors if they qualify as a "school official" with a "legitimate educational interest" — but only under a written agreement that restricts how the vendor uses the data.
Training AI on student records without consent is likely a violation
Using student education records to train an AI model — even for improving the educational service — is not clearly covered under the school-official exception. FERPA consent may be required.
Parental rights until age 18
For K-12 students, FERPA rights belong to parents. Any AI system that processes student education records must accommodate parental access, correction, and consent rights.
State laws may be stricter
SOPIPA (California), New York Education Law §2-d, Colorado HB 20-1468, and similar laws impose additional restrictions on EdTech vendors beyond FERPA. Always check state law.
COPPA and Student Privacy Checklist for AI Systems
COPPA applies to AI systems that collect personal information from children under 13. For schools using AI, the school-consent exception allows teachers and school officials to consent on behalf of parents — but only for services used for educational purposes, not for commercial AI products.
No AI system collects personal data from children under 13 without verifiable parental consent
No AI system uses student data for targeted advertising or commercial profiling
Third-party AI vendors have agreed not to use student data beyond providing the contracted service
Data minimisation: AI systems collect only data necessary for the educational purpose
Parents and students (13+) can access, correct, and delete AI-generated records about them
All AI vendors who access student data are listed in the school's annual FERPA notification
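The six checklist items above can be tracked as a machine-checkable record. A minimal sketch, assuming a field-per-item layout; the field names mirror the checklist for this example and are not a standard schema.

```python
# Minimal sketch of the student-privacy checklist as a structured
# record. Field names are assumptions mirroring the six items above.

from dataclasses import dataclass, fields

@dataclass
class StudentPrivacyChecklist:
    parental_consent_under_13: bool       # COPPA verifiable consent
    no_ads_or_commercial_profiling: bool  # no targeted advertising
    vendor_use_limited_to_service: bool   # contractual data-use limits
    data_minimisation: bool               # only data the purpose needs
    access_correct_delete_rights: bool    # parent/student (13+) rights
    vendors_in_ferpa_notification: bool   # vendors in annual FERPA notice

def failing_items(c: StudentPrivacyChecklist) -> list[str]:
    """Return the names of checklist items that are not satisfied."""
    return [f.name for f in fields(c) if not getattr(c, f.name)]

check = StudentPrivacyChecklist(True, True, True, True, False, True)
print(failing_items(check))  # ['access_correct_delete_rights']
```

An empty result from `failing_items` means every item is satisfied; anything returned names the gaps to remediate before deployment.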
Key US State Student Privacy Laws Affecting AI
SOPIPA (California)
Prohibits EdTech vendors from using student data for targeted advertising, building behavioural profiles for non-educational purposes, or selling student data. Covers operators of school-directed services — including AI tools.
New York Education Law §2-d
Applies to all "third-party contractors" handling student data. Requires Parents' Bill of Rights notice, strict data security requirements, and limits use to the contracted educational purpose. One of the strictest state-level protections.
Colorado AI Act (SB 24-205)
Effective June 30, 2026. If an AI system is used in student placement or educational benefit decisions, Colorado's deployer obligations apply — including impact assessments and student notification rights.
Illinois AIVIA and BIPA
Biometric Information Privacy Act applies to AI systems that analyse facial geometry, voiceprints, or other biometrics — including emotion-detection or engagement-monitoring AI in classrooms.
For EdTech Companies: What You Must Build In
Data Processing Agreement template that meets both FERPA school-official exception and GDPR Art.28 requirements
Data use limitation: contractual and technical controls preventing use of student data beyond the contracted service
Model training disclosure: clear statement in DPA of whether student data is used for model training, and if so, how
Parental access portal: mechanism for parents to access, correct, and delete AI-generated records about their child
Annual audit rights: allow school clients to audit compliance with data use restrictions
EU AI Act conformity documentation for high-risk AI products — required for EU school customers
Human oversight mechanism: high-risk AI decisions must be reviewable by a teacher or counsellor before taking effect
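The "data use limitation" control above pairs contractual terms with a technical guard. A minimal sketch of such a guard, assuming illustrative purpose names; a production system would load the allowed purposes from the signed DPA rather than hard-coding them.

```python
# Illustrative sketch of a deny-by-default data-use guard: student
# data may only be used for contracted educational purposes.
# Purpose names here are assumptions for this example.

ALLOWED_PURPOSES = {"grading", "progress_reporting", "tutoring"}
BLOCKED_PURPOSES = {"model_training", "advertising", "resale"}

def authorise_use(purpose: str) -> bool:
    """Allow only contracted educational purposes; deny anything
    explicitly blocked or simply unknown (deny-by-default)."""
    if purpose in BLOCKED_PURPOSES:
        return False
    return purpose in ALLOWED_PURPOSES

print(authorise_use("grading"))           # True
print(authorise_use("model_training"))    # False
print(authorise_use("analytics_export"))  # False: unknown, so denied
```

Denying unknown purposes, rather than maintaining only a blocklist, matches the FERPA school-official exception's logic: use must be affirmatively tied to the contracted educational service.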
Map Your AI Compliance Obligations
ComplianceIQ automatically identifies which regulations apply to your AI systems based on your sector, jurisdiction, and use case — including FERPA, EU AI Act, and COPPA.
Run a Free Risk Assessment