EU AI Act Article 4 · April 17, 2026 · 10 min read

AI Literacy Training: What Regulations Require and How to Build a Programme

EU AI Act Article 4 is not a suggestion. Since February 2, 2025, organisations providing or deploying AI systems in the EU must ensure "sufficient AI literacy" for all relevant staff. Here is what that means in practice.

What EU AI Act Article 4 Actually Says

"Providers and deployers of AI systems shall take measures to ensure, to their best extent, a sufficient level of AI literacy of their staff and other persons dealing with the operation and use of AI systems on their behalf, taking into account their technical knowledge, experience, education and training and the context the AI systems are to be used in, and considering the persons or groups of persons on which the AI systems are to be used."

— EU AI Act, Article 4

Four things to note about this obligation:

  • It applies to both providers (developers) and deployers (companies using AI) — not just the technology vendors.
  • "Sufficient" is context-dependent — a receptionist using a chatbot needs less training than a radiologist relying on AI-assisted diagnosis.
  • It covers "other persons" working on your behalf — meaning contractors and third-party operators are within scope.
  • It has applied since February 2, 2025 — the same date as the prohibited practices provisions. This is not a future obligation.

GDPR Training Requirements That Also Apply

GDPR does not have a dedicated "AI training" article, but several obligations create de facto training requirements for AI systems processing personal data:

Article 5(1)(f)

Integrity and confidentiality — staff must understand what data can be entered into AI tools.

Article 24

Controller accountability — demonstrating compliance requires evidence that staff are trained.

Article 32

Security of processing — staff must understand risks when entering personal data into AI systems.

Article 35 + Recital 84

DPIA — staff conducting DPIAs for AI systems must understand AI risk assessment methodology.

Role-Based Training Requirements

A single company-wide AI training module does not satisfy Article 4. The law requires training calibrated to each person's role and the AI systems they use. Here is a practical role-based matrix:

All staff using AI tools

1–2 hours · Annual + at onboarding

Must cover:

  • What AI systems they are authorised to use and for what tasks
  • What data may and may not be entered into AI tools
  • When to escalate AI outputs for human review
  • How to report AI errors or unexpected behaviour

AI system operators and deployers

4–8 hours · Before deployment + annual refresh

Must cover:

  • EU AI Act obligations for their specific system (Article 26)
  • Human oversight responsibilities and when to override AI
  • Incident reporting obligations (Article 73)
  • Record-keeping requirements

AI developers and data scientists

8–16 hours · Quarterly updates + project-level briefings

Must cover:

  • Technical compliance requirements for high-risk AI (Articles 9–15)
  • Bias testing methodology and documentation
  • Data governance requirements (Article 10)
  • Logging and traceability obligations (Article 12)

Senior leadership and board

2–3 hours · Annual

Must cover:

  • Organisational liability under EU AI Act and GDPR
  • Board-level AI governance responsibilities
  • Risk appetite and escalation framework
  • Regulatory landscape overview
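A matrix like the one above is easiest to keep current, and to report against, when it is encoded as data rather than buried in a policy PDF. Below is a minimal sketch in Python: the tier keys, hour ranges, and cadences mirror the matrix above, but the `TrainingTier` structure and `required_training` helper are hypothetical illustrations, not part of any compliance tool.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TrainingTier:
    """One row of the role-based training matrix."""
    role: str
    hours: tuple[int, int]   # (min, max) hours per training cycle
    cadence: str
    topics: tuple[str, ...]

# Tiers mirror the role-based matrix above (hypothetical encoding).
TIERS = {
    "all_staff": TrainingTier(
        role="All staff using AI tools",
        hours=(1, 2),
        cadence="annual + at onboarding",
        topics=("authorised systems", "data entry rules",
                "escalation to human review", "error reporting"),
    ),
    "operator": TrainingTier(
        role="AI system operators and deployers",
        hours=(4, 8),
        cadence="before deployment + annual refresh",
        topics=("deployer obligations", "human oversight",
                "incident reporting", "record-keeping"),
    ),
    "developer": TrainingTier(
        role="AI developers and data scientists",
        hours=(8, 16),
        cadence="quarterly + project-level briefings",
        topics=("high-risk requirements", "bias testing",
                "data governance", "logging and traceability"),
    ),
    "leadership": TrainingTier(
        role="Senior leadership and board",
        hours=(2, 3),
        cadence="annual",
        topics=("organisational liability", "AI governance",
                "risk appetite", "regulatory landscape"),
    ),
}

def required_training(tier_key: str) -> TrainingTier:
    """Look up the training requirements for a role tier."""
    return TIERS[tier_key]
```

Keeping the matrix in one structure means your training platform, your gap reports, and your audit evidence all read from the same source of truth.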

What Counts as "AI Literacy" Training?

The EU AI Act does not prescribe specific training formats. Acceptable formats include:

E-learning modules

Asynchronous, scalable — ideal for all-staff requirements. Must be role-specific, not generic.

Live workshops

Better for operators and developers. Allows Q&A and scenario-based exercises.

Policy attestation

Employees certify they have read and understood the AI use policy. Lowest bar — best paired with other training.

Scenario-based exercises

Employees work through realistic AI decision scenarios. High retention, audit-friendly.

On-the-job training

Supervision during initial AI system use. Documented by supervisor. Valid for operators.

External certification

CIPP/E, AI governance certifications from IAPP or similar. Strong evidence for senior roles.

What Regulators Will Want to See in an Audit

When a market surveillance authority examines your AI literacy compliance, these are the five evidence items they will ask for:

  • Training completion records by employee and date
  • Training content version history (showing it reflects current law)
  • Assessment scores or attestation records
  • Role-based training matrix showing who received what
  • Named responsible owner for the training programme

Common gap

Most companies have training records for data protection but cannot demonstrate the training was AI-specific or covered the AI systems actually in use. Generic "AI awareness" content created before your AI inventory was completed will not satisfy an Article 4 review.

Building Your Programme: 8-Week Plan

Weeks 1–2

Complete AI system inventory. Map which employees interact with which AI systems. Define your training tiers (minimum 3: general staff / operators / developers).

Weeks 3–4

Build or procure training content for each tier. General staff module: 45–60 minutes. Operator module: half-day. Developer module: full day.

Week 5

Pilot with a small group. Gather feedback. Update content.

Weeks 6–7

Roll out company-wide. Track completions. Capture attestation records.

Week 8

Review completion rate. Identify gaps. Schedule first annual refresh date. Document programme owner and review cadence.
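The Week 8 gap review can be as simple as a set difference between who should be trained at which tier and who has a completion record for it. A minimal sketch with made-up employee names and tier keys:

```python
# Week 8 gap check (hypothetical data shapes): who still needs training?
staff_tiers = {          # employee -> required training tier
    "ana": "all_staff",
    "ben": "operator",
    "caz": "developer",
}
completions = {("ana", "all_staff"), ("caz", "developer")}

gaps = sorted(
    emp for emp, tier in staff_tiers.items()
    if (emp, tier) not in completions
)
print(gaps)  # -> ['ben']
```

Scheduling this check to run before each refresh date keeps the completion evidence current between annual cycles.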

Track your AI literacy compliance

ComplianceIQ tracks Article 4 training requirements, maps them to your AI system inventory, and generates the audit evidence regulators expect.

Start free