In Force Now · April 13, 2026 · 8 min read

EU AI Act — What Small Businesses Actually Need to Do

The EU AI Act is the world's first comprehensive AI law. If you use AI in your business — even just ChatGPT for emails — this law may affect you. Here is a plain-English guide to what it says, what the fines are, and exactly what you need to do before the deadlines.

Key deadline: August 2, 2026

The main provisions of the EU AI Act apply from this date. General-purpose AI model obligations (like rules for ChatGPT providers) took effect August 2, 2025. Prohibited practices have been enforceable since February 2, 2025.

Does the EU AI Act apply to my business?

The short answer: if you are a small business and you use AI tools (as opposed to building them), the EU AI Act's requirements for you are relatively light — but not zero.

Here is how the law divides responsibility:

  • Provider: companies that build AI systems (OpenAI, Google, Anthropic, your software vendor). They carry the heaviest compliance burden.
  • Deployer: businesses that use AI tools in their operations to make decisions that affect customers, employees, or clients. This is likely you.
  • User: individuals using AI for personal, non-professional purposes. Minimal obligations.

If you are a small business that uses AI tools to help make decisions — hiring candidates, approving loans, triaging customer complaints, generating content — you are a deployer, and the EU AI Act applies to you.

What is "high-risk" AI? And am I using it?

The EU AI Act uses a risk-based pyramid. Most obligations fall on "high-risk" AI uses. High-risk does NOT mean "dangerous AI" — it means AI used in specific sensitive contexts.

High-Risk AI Uses (strict obligations)

  • Hiring and recruitment (CV screening, interview analysis, employee scoring)
  • Credit and loan decisions
  • Insurance risk assessment
  • Healthcare diagnostics or treatment decisions
  • Education: scoring, grading, tracking students
  • Law enforcement applications
  • Border control and migration processing
  • Access to essential public services

Limited-Risk AI (transparency obligations only)

  • Chatbots and AI assistants interacting with customers
  • AI-generated content (text, images, audio, video)
  • Deepfakes and synthetic media

Minimal Risk (no specific obligations)

  • Using ChatGPT to draft emails or summarize documents
  • AI spelling checkers and grammar tools
  • Recommendation algorithms on your website
  • Spam filters

What are the fines?

The EU AI Act sets fines as the higher of a fixed amount or a percentage of your company's global annual turnover — not just EU revenue:

  • Prohibited AI practices: up to €35 million or 7% of global turnover
  • High-risk AI violations: up to €15 million or 3% of global turnover
  • Supplying incorrect or misleading information to authorities: up to €7.5 million or 1% of global turnover

For small businesses and start-ups, the Act does scale this down: Article 99(6) caps SME fines at whichever of the two amounts is lower, and regulators must weigh proportionality when setting penalties — but there is no outright exemption based on company size.

The 5 things small businesses actually need to do

1. Know which AI tools your team uses

You cannot comply if you do not know what you are complying with. Create an inventory of every AI tool used in your business — ChatGPT, Copilot, Gemini, AI in your CRM, AI in your HR software, etc. Include the vendor, use case, and which employees use it.
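A first-pass inventory does not need special software. As a minimal sketch (the tool names, vendors, and column choices below are illustrative examples, not recommendations), a short script can capture the inventory as structured data and export it to a CSV your team can review and keep current:

```python
import csv

# Hypothetical inventory of AI tools in use across the business.
# Each entry records the tool, its vendor, what it is used for, and who uses it.
ai_inventory = [
    {"tool": "ChatGPT", "vendor": "OpenAI", "use_case": "drafting emails", "used_by": "marketing"},
    {"tool": "HR screening module", "vendor": "ExampleHR Ltd", "use_case": "CV screening", "used_by": "HR"},
    {"tool": "CRM assistant", "vendor": "ExampleCRM", "use_case": "summarizing customer notes", "used_by": "sales"},
]

# Export to CSV so the inventory can be shared, reviewed, and updated.
with open("ai_inventory.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["tool", "vendor", "use_case", "used_by"])
    writer.writeheader()
    writer.writerows(ai_inventory)
```

A shared spreadsheet works just as well; the point is that every tool, vendor, and use case is written down in one place.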

2. Classify the risk level of each tool

For each tool, ask: "Could this AI system affect someone's access to employment, credit, healthcare, or education?" If yes, it is likely high-risk. If you use it only for drafting text or internal productivity, it is minimal risk.
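The question above can be turned into a rough keyword triage. This is a sketch only — the keyword lists mirror the risk categories earlier in this article and are not legal definitions, so any "high-risk" hit should go to a human (or a lawyer) for confirmation:

```python
# First-pass triage of a tool's use case against the Act's risk tiers.
# Keywords are illustrative, drawn from the high-risk and limited-risk
# lists above; they are not the Act's legal wording.
HIGH_RISK = {"hiring", "recruitment", "cv screening", "credit", "loan",
             "insurance risk", "diagnosis", "grading", "law enforcement"}
LIMITED_RISK = {"chatbot", "ai-generated content", "deepfake", "synthetic media"}

def classify(use_case: str) -> str:
    text = use_case.lower()
    if any(keyword in text for keyword in HIGH_RISK):
        return "high-risk: strict obligations, written documentation needed"
    if any(keyword in text for keyword in LIMITED_RISK):
        return "limited-risk: transparency obligations"
    return "minimal-risk: no specific obligations"

print(classify("CV screening for job applicants"))   # high-risk
print(classify("Customer-facing chatbot"))           # limited-risk
print(classify("Summarizing internal documents"))    # minimal-risk
```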

3. Write an AI policy (high-risk deployers)

If you use high-risk AI, you need written documentation: what the AI does, how you use it, what human oversight exists, how you handle errors, and how you manage the data used. This does not need to be complex — a 2-page document often suffices for small businesses.

4. Add AI disclosures to your chatbot

If you have any AI chatbot on your website or in your customer communications, you must disclose that customers are interacting with AI. This is a "limited risk" obligation that applies to all businesses — even if your AI use is otherwise minimal.
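In practice this can be as simple as prefixing the first bot message in each session. The sketch below assumes a hypothetical chat backend with a per-session state dictionary; the disclosure wording is an example, not mandated text:

```python
# Prepend a one-time AI disclosure to the first reply a website chatbot
# sends in each session. Session handling here is a minimal stand-in for
# whatever state your chat platform provides.
DISCLOSURE = "You are chatting with an AI assistant."

def reply_with_disclosure(reply: str, session: dict) -> str:
    if not session.get("disclosed"):
        session["disclosed"] = True
        return f"{DISCLOSURE}\n\n{reply}"
    return reply

session = {}
print(reply_with_disclosure("How can I help?", session))        # disclosure shown once
print(reply_with_disclosure("Here is our pricing.", session))   # no repeat
```

Many chatbot platforms also let you set a fixed greeting message, which achieves the same thing without code.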

5. Label AI-generated content

Synthetic media — AI-generated images, videos, audio — must be labeled as AI-generated when published. If you use tools like AI image generators for marketing, add clear labeling.

What are the key deadlines?

| Date | What becomes enforceable |
| --- | --- |
| ✓ Past · Feb 2, 2025 | Prohibited AI practices banned (no exceptions) |
| ✓ Past · Aug 2, 2025 | Rules for general-purpose AI model providers |
| Aug 2, 2026 | Full enforcement: high-risk AI, transparency rules, deployer obligations |
| Aug 2, 2027 | High-risk AI embedded in regulated products (including medical device AI under MDR/IVDR) |

The honest bottom line for small businesses

If you only use AI for productivity (drafting, summarizing, spell-checking), the EU AI Act places very few obligations on you. The main one: disclose AI chatbots to customers.

If you use AI to make decisions that affect people (hiring, credit, health), you need to treat this seriously: document your AI use, ensure human oversight, and understand your vendor's compliance obligations.

The worst thing you can do is ignore this entirely. Regulators will start with the biggest offenders, but EU national authorities have significant budgets and political pressure to enforce these rules. Start now while the requirements are still manageable.

Legal disclaimer

This article is for informational purposes only and does not constitute legal advice. EU AI Act compliance requirements vary by your specific situation, industry, and country. Consult a qualified legal professional for advice specific to your business.

Not sure which laws apply to your business?

ComplianceIQ scans your AI tools, checks 155+ jurisdictions, and gives you a personalized compliance roadmap — in minutes, not months.