AI Supply Chain Compliance: Due Diligence for AI Vendors and Partners
Using a vendor's AI system in your business operations typically makes you an EU AI Act deployer — with your own compliance obligations regardless of the vendor's compliance status. AI supply chain compliance is the discipline of verifying that vendors meet their obligations, ensuring your contracts protect you, and monitoring vendor performance after deployment.
How the EU AI Act Allocates Supply Chain Responsibility
The EU AI Act establishes a shared responsibility model across the AI supply chain. Unlike traditional product liability, where responsibility concentrates at the manufacturer, EU AI Act responsibility is allocated based on your role:
Provider (also: developer, vendor, manufacturer)
The company that develops, trains, and makes the AI system available. Responsible for conformity assessment, technical documentation, and CE marking for high-risk AI.
Liability: Primary liability for intrinsic AI system failures, design defects, and pre-deployment bias.
Examples: OpenAI (provides API), Salesforce (provides AI CRM), Microsoft (provides Copilot)
Deployer
The company that uses an AI system under its own authority in its products or business operations, for any purpose other than personal, non-professional use. You become a deployer the moment you put a third-party AI system to work in your operations.
Liability: Liability for deployment context, human oversight failures, and use outside the provider's intended scope.
Examples: A bank using an AI credit-scoring API; a recruiter using an AI CV-screening tool; an insurer using an AI underwriting system
Importer
Applicable to physical AI products (embedded AI systems, AI robots). Places a third-country provider's AI product on the EU market.
Liability: Must verify that the third-country provider has met EU AI Act requirements (conformity assessment, documentation) before placing the product on the EU market.
Examples: EU company distributing AI-embedded medical devices from a US manufacturer
Distributor
Makes an AI system available on the EU market without modifying it. Less common in software contexts.
Liability: Must verify the system has required CE marking and documentation before distribution.
Examples: SaaS resellers; cloud marketplace providers who list third-party AI tools
Key insight: "Deployer" is not a passive role
Many businesses assume that using a third-party AI system means the vendor takes all compliance responsibility. This is incorrect. Article 26 of the EU AI Act imposes explicit obligations on deployers of high-risk AI systems — including assigning human oversight, monitoring operation, reporting serious incidents, and using the system in accordance with its instructions for use.
AI Vendor Due Diligence by Risk Tier
Tier 1 — High-risk AI (EU AI Act Annex III)
EU AI Act Declaration of Conformity — has the provider issued one? Request a copy.
CE marking and EU database registration (from 2 August 2026)
Technical documentation summary — provider must share relevant sections with deployers
Bias testing results and methodology — request the results before deployment, not after
Human oversight mechanism — how does the system support your oversight obligations?
Post-market monitoring plan — provider's plan and what they report to deployers
Instructions for use — do they clearly state intended use, limitations, and accuracy metrics?
GDPR Data Processing Agreement — mandatory if personal data is processed
Tier 2 — AI systems with GDPR personal data processing
GDPR Data Processing Agreement (Article 28) — mandatory, not optional
Sub-processor list — who else does the vendor share your data with?
Data residency — where is data processed? EU-only processing avoids the transfer-assessment burden that Schrems II imposes
International transfer mechanism — SCCs or equivalent if data leaves EEA
Retention and deletion policy — when does the vendor delete your data?
Incident notification commitment — 72-hour notification for data breaches
Audit rights — can you audit or request audit reports (e.g., SOC 2, ISO 27001)?
Tier 3 — General AI tools (lower risk)
Data use policy — does the vendor use your data to train their model?
Security certifications — SOC 2 Type II or ISO 27001 as minimum
Data deletion on termination — confirmed in writing
Privacy notice and terms of service — reviewed and acceptable
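The tiered checklists above lend themselves to simple tooling. The sketch below — entirely illustrative, with tier names and checklist wording of our own choosing, not official EU AI Act artefacts — shows one way to map two screening questions (Annex III use case? personal data processed?) to the strictest applicable tier and track outstanding items:

```python
# Hypothetical sketch: tier names and checklist items paraphrase the
# due-diligence tiers described above; adapt them to your own register.

TIER_CHECKLISTS = {
    "tier1_high_risk": [
        "Declaration of Conformity on file",
        "CE marking / EU database registration verified",
        "Technical documentation summary received",
        "Bias testing results reviewed",
        "Human oversight mechanism documented",
        "Post-market monitoring plan received",
        "Instructions for use reviewed",
        "GDPR DPA signed",
    ],
    "tier2_personal_data": [
        "GDPR Article 28 DPA signed",
        "Sub-processor list received",
        "Data residency confirmed",
        "Transfer mechanism (SCCs) in place",
        "Retention and deletion policy reviewed",
        "72-hour breach notification commitment",
        "Audit rights / SOC 2 report obtained",
    ],
    "tier3_general": [
        "No-training-on-customer-data confirmed",
        "SOC 2 Type II or ISO 27001 verified",
        "Deletion on termination confirmed in writing",
        "Privacy notice and ToS reviewed",
    ],
}

def classify_vendor(annex_iii_use_case: bool, processes_personal_data: bool) -> str:
    """Map two screening questions to the strictest applicable tier."""
    if annex_iii_use_case:
        return "tier1_high_risk"
    if processes_personal_data:
        return "tier2_personal_data"
    return "tier3_general"

def open_items(tier: str, completed: set[str]) -> list[str]:
    """Checklist items still outstanding for a vendor in the given tier."""
    return [item for item in TIER_CHECKLISTS[tier] if item not in completed]
```

Note that the tiers nest: a high-risk AI system that processes personal data must satisfy the Tier 2 items as well, so in practice you would union the applicable checklists rather than pick one.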
Essential Contract Clauses for AI Vendor Agreements
Standard AI vendor MSAs often lack the protections EU AI Act and GDPR compliance requires. These clauses must be negotiated and included:
No training on Customer Content
Required: Provider shall not use any Customer Content, including but not limited to inputs to the AI system and outputs generated using Customer Content, for the purpose of training, fine-tuning, or improving any AI model, including models made available to customers other than Customer, without Customer's prior written consent.
EU AI Act compliance warranty
Required: Provider warrants that the AI system, as of the effective date and throughout the term, complies with all applicable requirements under Regulation (EU) 2024/1689 (EU AI Act) for the intended use described in the Statement of Work. Provider shall provide Customer with all information necessary for Customer to fulfil its obligations as a deployer under the EU AI Act.
IP ownership of outputs
Required: All outputs generated by the AI system using Customer Content are assigned to Customer, including all intellectual property rights therein, to the extent permitted under applicable law. Provider retains no rights in Customer-specific outputs.
Incident notification
Required: Provider shall notify Customer of any security incident affecting Customer Content, or any incident affecting the performance or availability of the AI system, within 24 hours of becoming aware of such incident, and shall provide a detailed report within 72 hours.
Bias and performance monitoring
Provider shall conduct bias testing of the AI system no less frequently than annually and shall share the results of such testing with Customer. Provider shall notify Customer promptly if monitoring reveals a material change in the accuracy, fairness, or performance characteristics of the AI system.
Right to audit
Customer shall have the right, upon 30 days' written notice, to audit Provider's compliance with this agreement and applicable law, or to commission an independent third party to conduct such audit at Customer's expense. Provider shall cooperate fully with any such audit.
Ongoing Vendor Monitoring: What to Check After Deployment
AI vendor due diligence is not a one-time exercise at procurement. The EU AI Act requires ongoing monitoring, and vendor AI systems can change without notice:
Continuously: track AI system availability and error rates — anomalies may indicate model updates, bias drift, or security incidents.
Regularly: review vendor communications for model updates, check vendor security bulletins, and verify the DPA and contractual terms still cover your current usage.
Periodically: run a full vendor risk review — request updated bias audit results, verify EU AI Act compliance status for high-risk AI, re-confirm data use restrictions are being honoured, and check the vendor's current SOC 2 / ISO 27001 report.
On trigger events: vendor acquisition or change of ownership; material change to privacy policy or terms of service; a security incident at the vendor; new regulatory guidance affecting the AI use case; a significant model update or version change.
The AI Vendor Lock-In Compliance Risk
Vendor lock-in is both a commercial risk and a compliance risk. If you become so dependent on a single AI vendor that you cannot exit without major disruption:
You lose negotiating power on contract renewals — the vendor can add unfavourable terms and you cannot leave
GDPR Article 28(3)(g) requires the processor to delete or return all personal data at the end of the service: can you actually export all your data if you terminate the vendor relationship?
If the vendor is acquired and changes its data use policy, you may face a compliance emergency with no exit path
EU AI Act compliance depends on the vendor continuing to provide required documentation and updates — this is not guaranteed
Risk diversification: using a single AI vendor for multiple high-risk use cases concentrates AI supply chain risk
Mitigation: maintain a tested data export process for each AI vendor; include exit assistance and data handover obligations in every AI vendor contract; test the exit process before you need it.
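"Test the exit process before you need it" can be made concrete with an export-verification check. A hypothetical sketch — the JSON export layout and entity names are assumptions, not any vendor's actual format — that compares a vendor's data export against record counts from your own system of record:

```python
import json

def verify_export(export_path: str, expected_counts: dict[str, int]) -> list[str]:
    """Return a list of discrepancies between a vendor data export and
    the record counts you expect. Assumes a JSON export shaped as
    {entity_name: [records...]}; adapt the loader to the real format."""
    with open(export_path) as f:
        export = json.load(f)
    problems = []
    for entity, expected in expected_counts.items():
        actual = len(export.get(entity, []))
        if actual != expected:
            problems.append(f"{entity}: expected {expected}, got {actual}")
    return problems
```

An empty result means the export matched expectations; anything else is a gap to raise with the vendor while you still have leverage, not after you have given notice.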
Track Your AI Vendor Compliance
ComplianceIQ's vendor risk module helps you track due diligence status, contract review dates, and monitoring schedules across your entire AI vendor portfolio.
Run a Free Risk Assessment