AI Contracts and IP Ownership: What Every Business Needs to Know
AI vendor contracts are full of clauses that transfer your IP, permit training on your data, and strip away liability protections. AI-generated content may not qualify for copyright protection at all. Here is how to protect your business in a world where AI produces work product but existing IP law was not written for it.
Who Owns AI-Generated Content? The Copyright Problem
Copyright law across most jurisdictions requires human authorship. An AI system cannot be a copyright author. This creates a genuine legal gap:
United States
AI-only output is not copyrightable. The US Copyright Office has consistently refused registration for AI-only outputs. If a human makes sufficient creative choices in prompting or editing, those human contributions may be copyrightable, but the AI's contribution is in the public domain.
European Union
Originality requires human intellectual creation. CJEU case law defines copyright as protecting "original intellectual creation", which implies human creative choices. Purely AI-generated content likely does not meet this threshold. The EU AI Act does not address copyright ownership directly.
United Kingdom
Computer-generated works: the author is the person who made the necessary arrangements. CDPA s.9(3) is unusual in providing copyright for computer-generated works, attributing authorship to "the person by whom the arrangements necessary for the creation of the work are undertaken." This may provide thin copyright protection for AI outputs in the UK.
China
No copyright for AI-generated content without human authorship. Chinese courts have found that AI-generated content is not copyrightable without human authorship. The operator of the AI tool may have neighbouring rights in some circumstances, but full copyright requires a human author.
Practical implication
If your business produces AI-generated content — marketing copy, code, designs, reports — that content is likely not fully protectable by copyright. Competitors can legally copy it. Disclose AI use in client contracts so clients are not surprised by this limitation.
IP Ownership by Scenario
Your employee uses ChatGPT to draft a contract
High risk
Ownership: Unclear / contested
OpenAI's terms assign output to the user — but US copyright law does not recognise AI authorship. The output is likely in the public domain in the US. Other jurisdictions vary. Do not assume AI-drafted contracts are protectable as your exclusive, confidential IP.
Your employee uses GitHub Copilot to write production code
Medium risk
Ownership: Your company (with conditions)
GitHub Copilot enterprise terms include an IP indemnification clause — if generated code is alleged to infringe, GitHub defends you. However, this indemnification is conditional: it typically requires using the product as intended, with its duplicate-code filtering enabled, and not customising the model.
You use an AI API to generate marketing copy for clients
High risk
Ownership: Client receives output; copyright unclear
If you are an agency using AI to generate client deliverables, your contract with the client may promise work product as their IP — but AI-generated content may not qualify for copyright protection. Disclose AI use in your engagement agreement.
You fine-tune an open-source AI model on your proprietary data
Medium risk
Ownership: Model weights likely yours; base model subject to licence
Fine-tuned weights derived from open-source models are subject to the base model's licence. Llama 2 and 3 have restrictions on commercial use above 700M MAU. Mistral is Apache 2.0 (permissive). Always check the base model licence before commercialising fine-tuned models.
An AI vendor trains on your customer data without a clear contract
Critical risk
Ownership: Likely the vendor
If you have not explicitly prohibited the vendor from using your data for model training, their default terms may permit it. This means your proprietary business data, customer interactions, and trade secrets could end up in a shared model. Demand a data use prohibition clause.
5 Dangerous AI Contract Clauses to Watch For
“We may use your content to improve our services”
Why it matters: This is the training data clause in disguise. "Improving services" includes training AI models on your inputs. If you accept this clause, your proprietary data — client conversations, internal documents, trade secrets — may be used to train the vendor's model.
Negotiation fix
Negotiate: "Provider shall not use Customer Content to train, fine-tune, or improve any AI model, including models made available to other customers."
“Provider grants you a limited, non-exclusive, non-transferable licence to use outputs”
Why it matters: This clause means you do not own the AI outputs — you licence them. The licence may be revoked, limited to specific uses, or expire with your subscription. You cannot sublicence outputs to clients.
Negotiation fix
Negotiate: "All outputs generated using Customer Content are assigned to Customer, including all IP rights therein."
“Provider reserves the right to modify, discontinue, or restrict the service”
Why it matters: For AI tools integrated into core workflows, a unilateral right to discontinue means your business processes are at risk. Combined with the output-licence clause, discontinued service = lost access to work product.
Negotiation fix
Add: "Provider shall give 90 days' notice before material changes to functionality. Customer may terminate without penalty if changes materially reduce utility."
“Outputs may not be accurate; Customer assumes all risk”
Why it matters: Standard in AI terms, but the scope matters. If your use case requires accurate outputs — legal research, medical information, financial calculations — this clause means you bear full liability for errors. Regulators may not accept "the AI said so" as a defence.
Negotiation fix
Document your human review process. Maintain audit logs showing that AI outputs are reviewed before consequential use. Negotiate so that the vendor retains liability for negligent or foreseeable errors in your vendor agreement.
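The audit-log record described above can be sketched as a small helper. This is a minimal illustration, not a compliance product: the function name, field names, and log filename are all hypothetical, and a real deployment would use your own logging and retention infrastructure.

```python
import hashlib
import json
from datetime import datetime, timezone

def log_ai_review(output_text: str, reviewer: str, tool: str,
                  approved: bool, notes: str = "") -> dict:
    """Append one audit-log record showing a human reviewed an AI output."""
    record = {
        # UTC timestamp of the review, in ISO 8601 form
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "tool": tool,                  # e.g. the AI product that produced the output
        # Hash the output rather than storing it, so the log itself
        # does not duplicate potentially sensitive content
        "output_sha256": hashlib.sha256(output_text.encode()).hexdigest(),
        "reviewer": reviewer,          # who performed the human review
        "approved": approved,          # whether the output was cleared for use
        "notes": notes,
    }
    # One JSON object per line (JSONL) keeps the log append-only and auditable
    with open("ai_review_log.jsonl", "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
    return record
```

The key design choice is hashing the output instead of storing it: the log proves which output was reviewed and when, without itself becoming a second copy of confidential material.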
“"By using our service, you represent you have all rights in submitted content"”
Why it matters: This representation becomes problematic if employees paste third-party content, confidential information, or personal data into the AI tool. A breach of this representation could void the vendor's IP indemnification.
Negotiation fix
Build employee training that restricts what types of content may be submitted to AI tools. Document in your AI Acceptable Use Policy.
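An Acceptable Use Policy restriction like the one above can be made mechanical with a simple category check. This is an illustrative sketch only: the category names and the `may_submit` helper are hypothetical, and the hard part in practice is classifying content, not checking the set.

```python
# Hypothetical prohibited categories for an AI Acceptable Use Policy.
PROHIBITED_CATEGORIES = {
    "client_confidential",   # NDA-covered client material
    "personal_data",         # information identifying individuals
    "trade_secret",          # internal proprietary know-how
    "third_party_licensed",  # content your business does not own outright
}

def may_submit(content_categories: set[str]) -> bool:
    """True only if none of the content's categories are prohibited."""
    return PROHIBITED_CATEGORIES.isdisjoint(content_categories)
```

A check like this can gate an internal AI-tool proxy or simply document the policy in an executable form for training purposes.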
AI Vendor Contract Review Checklist
Use this checklist when reviewing any AI vendor agreement. Items marked "critical" should block signature if absent.
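As an illustrative sketch, the five clause risks discussed above can be tracked as a simple checklist structure, with the critical items gating signature. The item wording and the `blocking_gaps` helper are hypothetical; adapt them to your own contract playbook.

```python
from dataclasses import dataclass

@dataclass
class ChecklistItem:
    clause: str              # short label for the contract area
    required_position: str   # the negotiated position you need
    critical: bool           # if True and unsatisfied, block signature

# Items derived from the five dangerous clauses discussed above.
AI_VENDOR_CHECKLIST = [
    ChecklistItem("Training-data use",
                  "Explicit prohibition on training with Customer Content", True),
    ChecklistItem("Output ownership",
                  "Outputs assigned to Customer, not merely licensed", True),
    ChecklistItem("Service changes",
                  "90 days' notice and termination right for material changes", False),
    ChecklistItem("Accuracy and liability",
                  "Vendor retains liability for negligent errors", False),
    ChecklistItem("Content representations",
                  "Scope matched to your AI Acceptable Use Policy", False),
]

def blocking_gaps(satisfied: set[str]) -> list[str]:
    """Critical items not yet satisfied; any result should block signature."""
    return [item.clause for item in AI_VENDOR_CHECKLIST
            if item.critical and item.clause not in satisfied]
```

Running `blocking_gaps` against the set of clauses already negotiated gives a shortlist of the must-fix items before the agreement is signed.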
Employee IP Assignments in the AI Era
Most employment agreements include a clause assigning employee-created work product to the employer. In the AI era, these clauses need updating:
The assignment clause should cover AI-assisted work product — work product created using AI tools in the course of employment
Employees should not have personal copyright claims to AI-assisted work done for the employer
Non-disclosure and confidentiality obligations should explicitly cover AI tool inputs — employees cannot share confidential information with AI vendors
The acceptable use policy for AI tools should be referenced or incorporated into employment agreements
For contractors and freelancers: update your standard statements of work (SoWs) to specify that AI-assisted deliverables are owned by the client, not the contractor
Track Your AI Vendor Contracts
ComplianceIQ's vendor risk module helps you track AI vendor contracts, flag renewal dates, and document data use restrictions across your vendor portfolio.
Run a Free Risk Assessment