Contract Addendum Template: Limiting Use of Patient Documents by Third‑Party AI Providers


Jordan Ellis
2026-04-14
19 min read

A reusable addendum to block patient document training, sale, and advertising use by third-party AI vendors.


Small practices are being asked to move faster than ever: scan intake packets, summarize records, route referrals, and sign agreements without adding staff. That pressure is exactly why vendors are pushing AI features into document workflows. But once patient documents enter a third-party AI system, the legal and operational questions get serious fast: Is the data used for model training? Is it retained? Can the provider repurpose it for advertising, analytics, or product development? This guide gives you a reusable contract addendum framework that small clinics can attach to a vendor agreement, alongside a privacy-forward data processing posture, to limit use of patient documents by AI providers.

Health data is not ordinary business data. The recent rollout of consumer-facing health features that analyze medical records shows how quickly sensitive information can become part of a broader AI product strategy, and why airtight safeguards matter. As reported by BBC, OpenAI said its health chats would be stored separately and not used for training, while privacy campaigners warned that health data requires “airtight” controls. For small practices, this means you should not rely on vendor marketing claims alone; you need contract language that restricts use, retention, subcontracting, and downstream monetization. For broader workflow context, see our guides on digital signatures and online docs for care teams and health-system analytics governance.

Why this addendum matters now

Patient documents are high-risk inputs for AI systems

Patient PDFs, scans, images, forms, and chart exports usually contain names, dates of birth, insurance IDs, diagnoses, medications, and billing details. Even if a vendor claims the data is de-identified later, the original upload process often exposes raw information to logging, support, debugging, or enrichment services. In practice, the biggest risk is not just a breach; it is silent secondary use. That is why a narrow data-use clause is as important as access control, encryption, or a signed BAA-style security review.

AI vendors often bundle useful and risky uses together

Many vendors want permission to “improve services,” “analyze usage,” or “develop new features.” Those phrases can hide broad rights to train models, fine-tune shared systems, or profile users for product development and advertising. If your practice is sending patient documents for summarization, OCR, classification, or routing, that broad language can conflict with your privacy obligations and your internal trust model. A good AI governance process should draw a hard line between operational processing and reuse for model development.

Regulatory pressure is increasing, not easing

Even small clinics are being pulled into more formal expectations around privacy, subcontractors, and minimum necessary use. Whether you are operating under HIPAA, state privacy rules, payer requirements, or contractual confidentiality obligations, the safer approach is to document that patient documents are processed only to provide the contracted service. That means no training, no sale, no ad targeting, no cross-client profiling, and no combining patient data with other customer datasets. For a practical view of how sensitive data can reshape product strategy, compare this issue with content creation in the age of AI and auditing LLM outputs in high-stakes workflows.

What the contract addendum should accomplish

Separate service delivery from model development

The core purpose of the addendum is to define a narrow permitted use: the vendor may process patient documents solely to deliver the contracted service, such as OCR, indexing, routing, redaction, extraction, or clinician-facing summaries. Everything beyond that should be prohibited unless the clinic gives explicit, written, itemized consent. This is the single most important line in the document because it prevents the vendor from defaulting to broad AI rights buried in its master agreement or online terms. If you’re comparing vendors, a useful lens is the same one used in multi-provider AI architecture: keep each use case isolated and contractual boundaries clear.

Block secondary monetization

Your addendum should say no sale, no rental, no licensing, no advertising use, and no behavioral profiling. In other words, the vendor cannot transform patient records into a product asset. This matters because the economics of AI are pushing many providers to monetize data directly or indirectly, especially when they are building advertising-adjacent ecosystems. The BBC report on health-focused AI is a reminder that firms are exploring business models beyond subscriptions, which makes contractual restrictions essential. If you want a broader pattern for managing monetization risk, our article on privacy as a differentiator shows how restrictions can become a trust signal.

Preserve deletion, retention, and audit rights

Most small practices underestimate the importance of retention language. A vendor may agree not to train on your data, but still keep it indefinitely in logs, backups, support tools, or review queues. Your addendum should require deletion on termination or on request, define backup expiry windows, and reserve audit or attestation rights for compliance verification. When a vendor is handling patient documents, “we don’t use it for training” is not enough unless retention and subprocessors are also controlled. For adjacent document workflows, see our guide to reducing admin time with digital signatures and our resource on third-party signing risk frameworks.

Reusable contract addendum template

The template below is designed for a small practice to attach to a vendor agreement, order form, or statement of work. It is intentionally written in plain English, but the clauses are structured to support legal review. You should have counsel adapt it for your state, your BAA obligations, and the specific service involved. Do not rely on a vendor’s standard DPA if it does not expressly cover patient documents, AI model training prohibition, and advertising restrictions. If you are digitizing records first, our article on data flow in AI-enabled layouts is a useful operations reference.

Pro Tip: The strongest contracts define both what the vendor can do and what it must never do. If a clause only says “we will protect your data,” that is a promise, not a processing boundary.

Template text

Contract Addendum: Use Restrictions for Patient Documents

This Addendum is incorporated into and made part of the Agreement between Customer and Vendor. If there is any conflict between this Addendum and the Agreement, this Addendum controls with respect to Patient Documents and Derived Data.

1. Definitions. “Patient Documents” means any records, forms, scans, PDFs, images, metadata, transcriptions, extracts, or other content containing or derived from patient information, protected health information, medical record information, billing information, insurance information, or appointment-related data submitted by or on behalf of Customer. “Derived Data” means any output, feature, label, embedding, summary, vector, statistic, inference, or derivative created from Patient Documents. “Process” or “Processing” means any operation performed on Patient Documents or Derived Data, including collection, storage, transmission, analysis, and disclosure.

2. Permitted Use. Vendor may Process Patient Documents and Derived Data solely as necessary to provide the services described in the Agreement to Customer, including storage, transmission, OCR, classification, retrieval, redaction, indexing, workflow automation, and customer-directed output generation.

3. Prohibited Uses. Vendor shall not, and shall not permit any Affiliate or Subprocessor to: (a) sell, rent, disclose, license, or otherwise make available Patient Documents or Derived Data for any purpose unrelated to providing the services; (b) use Patient Documents or Derived Data to train, fine-tune, evaluate, or improve any machine learning, large language model, foundation model, or similar system; (c) use Patient Documents or Derived Data for advertising, marketing, profiling, audience segmentation, or cross-context behavioral analysis; (d) combine Patient Documents or Derived Data with data from other customers except to provide the services to Customer; or (e) use Patient Documents or Derived Data for any internal product development not expressly approved in writing by Customer.

4. No De-Identification Exception Unless Approved. Vendor may not claim that de-identification, pseudonymization, aggregation, or tokenization permits any use prohibited by this Addendum unless Customer has expressly approved such use in writing and applicable law permits it.

5. Retention Limits. Vendor shall retain Patient Documents only for the minimum period necessary to provide the services and comply with law. Upon termination or Customer request, Vendor shall delete or return Patient Documents and Derived Data within [X] days, except for limited backup copies subject to rolling deletion and no operational access.

6. Security and Access Controls. Vendor shall maintain administrative, technical, and physical safeguards appropriate to the sensitivity of Patient Documents, including encryption in transit and at rest, role-based access, logging, least-privilege access, MFA for privileged accounts, and segregation of Customer data from training and analytics pipelines.

7. Subprocessors. Vendor may use Subprocessors only under written obligations at least as protective as this Addendum and shall remain fully responsible for their acts and omissions. Vendor shall provide advance notice of material Subprocessor changes.

8. Incident Notice. Vendor shall notify Customer without unreasonable delay, and in no event later than [48/72] hours, after confirming any unauthorized access, disclosure, or use of Patient Documents or Derived Data.

9. Audit and Attestation. Upon request, Vendor shall provide a written attestation that Patient Documents and Derived Data have not been used for model training, advertising, sale, or other prohibited uses. Customer may also request reasonable supporting documentation.

10. Survival. Sections 2 through 9 survive termination of the Agreement.

11. Order of Precedence. This Addendum supersedes any conflicting terms in the Agreement, online terms, privacy policy, acceptable use policy, or service documentation.

Clause-by-clause explanation for small clinics

Definitions must include outputs, not just uploads

Many templates protect “data” but forget the outputs created by AI systems. That omission matters because summaries, embeddings, and extracted fields can still reveal sensitive health information and can still be reused by the vendor. Your definition of Patient Documents should include scans, forms, images, and metadata, while Derived Data should capture outputs, vectors, and inferences. This is a common gap in vendor contracting, similar to how teams sometimes overlook hidden dependencies in cloud contracts; our guide on re-architecting cloud offerings explains why these hidden layers matter.
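The same principle can be expressed as a data model: if every AI output carries a provenance link back to its source document, deletion duties can cascade to Derived Data automatically instead of depending on someone remembering the summaries and embeddings. A minimal illustrative sketch, with all class and field names hypothetical:

```python
from dataclasses import dataclass


@dataclass
class DerivedData:
    """Any output created from a patient document (hypothetical shape)."""
    source_doc_id: str  # provenance link back to the Patient Document
    kind: str           # e.g. "summary", "embedding", "extracted_field"
    payload: object


class DerivedDataRegistry:
    """Track every AI output against its source so deletion can cascade."""

    def __init__(self) -> None:
        self._by_source: dict[str, list[DerivedData]] = {}

    def record(self, item: DerivedData) -> None:
        self._by_source.setdefault(item.source_doc_id, []).append(item)

    def delete_for_source(self, doc_id: str) -> int:
        """Purge all derived artifacts for a document; return how many."""
        return len(self._by_source.pop(doc_id, []))
```

A vendor that can produce this kind of mapping on request is in a much better position to honor Clause 5's deletion duty for Derived Data, not just for the uploaded files.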

Permitted use should be narrow and operational

Permitted use language should map to actual service steps, not vague business purposes. For example, if the vendor performs OCR and auto-filing, say that. If it helps generate patient portal summaries, say that. If you do not want human review by the vendor’s staff except for support tickets, say that too. This structure reflects the same disciplined thinking used in clinical decision support design, where the allowed role of the system must be precisely scoped.

Prohibited use should list the exact risks

Do not rely on a generic “no misuse” clause. State the things you are trying to prevent: training, fine-tuning, sale, ad targeting, profiling, and combining data across customers. If your vendor offers a “service improvement” option, require a separate opt-in with a named dataset, purpose, duration, and opt-out mechanism. This creates a paper trail that helps protect the practice if a dispute arises later. For a parallel in another regulated setting, see our article on LLM auditing for hiring.

How to use the addendum with a BAA and vendor agreement

Attach it before signature, not after implementation

The best time to negotiate data use restrictions is before the first file is uploaded. If you sign, onboard, and only then ask for a contract change, vendors often treat it as a commercial exception rather than a baseline requirement. Send the addendum with your redlines during procurement, with the BAA, security questionnaire, and order form all aligned. This avoids a common problem where the BAA protects PHI but the main agreement still grants the vendor broad reuse rights for product analytics or model improvement.

Make the hierarchy explicit

Your documents should say which contract controls if there is a conflict. In most cases, the addendum should override the master services agreement, online terms, privacy policy, and help center articles. Otherwise, the vendor can quietly point to a public policy that gives itself broader rights than your negotiated paper. This is the same risk logic behind platform acquisition strategy and SaaS/PaaS/IaaS selection: governance has to be defined at the highest contractual level, not inferred from the UI.

Coordinate with security and compliance teams

If you have an IT consultant, MSP, or part-time compliance officer, make sure they review the addendum against the vendor’s technical controls. The legal text should reflect the real system architecture: where files are stored, which APIs are used, whether human reviewers can see raw content, and how deletion works in backups. Contract language without technical enforcement is weak, and technical controls without contract language are hard to audit. For a structured approach to vendor evaluation, our guide on co-leading AI adoption safely is a useful cross-functional model.

What to negotiate with third-party AI providers

Training and product-improvement carveouts

Ask whether the vendor uses your data for model training, fine-tuning, RLHF, evaluation, prompt logging, or human review. If the answer is “no,” ask for a contractual statement and a technical explanation of how the separation is enforced. If the answer is “sometimes,” narrow the exception to a named feature, with explicit opt-in and no retention beyond the feature window. The BBC report on health-data AI is a clear reminder that even when a provider says it will not use a specific chat for training, the surrounding product ecosystem may still be evolving.

Advertising and commercial profiling

Many small practices think advertising restrictions are only relevant for consumer apps, but they matter in healthcare too. If a vendor has an ads business, cross-sells other products, or builds audiences from usage data, your patient documents should be off-limits entirely. Your addendum should prohibit contextual, behavioral, or inferred advertising based on document content. That is especially important if the vendor’s business strategy is changing, because health data can become a valuable segmentation source over time. For a broader perspective on monetization pressure, see privacy-forward hosting.

Subprocessors, support, and cross-border handling

Vendors often outsource OCR, storage, monitoring, or customer support. Every one of those subprocessors can become a data leakage point if they are not contractually bound to the same restrictions. Ask for a subprocessors list, notice of changes, and written flow-down obligations. If data moves across borders, confirm where processing occurs and whether local law changes the privacy analysis. A good procurement workflow borrows from security and governance tradeoffs: fewer trusted handling points usually means fewer surprises.
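One lightweight way to operationalize subprocessor notice is to snapshot the vendor's published subprocessor list periodically and diff it, so additions trigger a flow-down review and removals trigger a deletion confirmation. A sketch of that comparison, with hypothetical vendor names:

```python
def subprocessor_changes(previous: set[str], current: set[str]) -> dict[str, set[str]]:
    """Compare two snapshots of a vendor's subprocessor list."""
    return {
        "added": current - previous,    # new handling points needing flow-down review
        "removed": previous - current,  # confirm data was returned or deleted
    }


# Example usage with made-up subprocessor names:
changes = subprocessor_changes(
    previous={"OcrCo", "CloudStore"},
    current={"OcrCo", "SupportDesk"},
)
```

Even a quarterly manual check against this pattern gives the practice a paper trail showing it exercised its notice rights under Clause 7.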

Practical negotiation checklist

| Topic | What to ask for | Why it matters |
| --- | --- | --- |
| Model training | Express prohibition on training, fine-tuning, and evaluation on patient documents | Stops secondary AI reuse |
| Advertising | No ad targeting, audience building, or profiling from patient content | Prevents monetization of sensitive data |
| Retention | Deletion within a defined period after termination or request | Limits hidden long-term exposure |
| Subprocessors | Advance notice, flow-down obligations, and vendor liability | Extends protections through the chain |
| Audit rights | Attestation and reasonable supporting evidence | Creates verifiability, not just promises |
| Order of precedence | Addendum overrides online terms and privacy policies | Prevents hidden conflicts |

If you are evaluating multiple products, use the same procurement rigor you would apply to choosing a signing stack or document platform. Our articles on signing provider cyber risk, multi-provider AI, and privacy-first hosting are helpful benchmarks for building a vendor scorecard.

Operational examples for small clinics

Example 1: solo dental practice using AI intake parsing

A solo dental office uses an AI document tool to read insurance cards and intake forms. Without a contract addendum, the vendor’s standard terms allow “service improvement” and “analytics.” That could permit the vendor to keep card images in logs, use them to improve recognition models, and combine them with other customers’ files. With the addendum attached, the office can allow OCR and routing while blocking training, sale, and advertising. This preserves convenience without turning the office’s patient paperwork into a dataset.

Example 2: urgent care center and chart summarization

An urgent care clinic wants AI to summarize discharge papers and scan results for staff triage. The vendor can be valuable if it only processes documents transiently and returns a summary to authorized users. But if the provider also wants to improve a general medical model from those summaries, the clinic should refuse unless the use is separately negotiated and legally reviewed. This is exactly the kind of boundary that a good contract addendum creates: operational automation without unintended data exhaust.

Example 3: specialty practice with a BAA and EHR connector

A specialty practice may already have a BAA with its EHR vendor and assume that downstream AI tools are covered. That is often not true. If the AI vendor is a separate subcontractor, the practice still needs direct contractual controls over the document workflow, especially if PHI is exported from the EHR into a third-party app. In that scenario, the addendum works like a privacy fence around the vendor agreement, while the BAA handles HIPAA responsibilities. For additional workflow context, see our guide on digital signature-enabled care workflows.

Common mistakes to avoid

Assuming “no training” equals safe

A vendor can avoid training while still using your content for support review, logging, troubleshooting, quality assurance, product analytics, or human-in-the-loop evaluation. That is why the addendum has to prohibit not only training but also sale, profiling, and internal product development unless explicitly approved. If the practical effect is that patient documents are still reused beyond the service, the promise is too weak.

Leaving derived data unprotected

If your contract only mentions uploaded files, the vendor may argue that its summaries, embeddings, classifications, or vector representations are its own internal work product. That interpretation can undermine your privacy position and make deletion requests incomplete. Always define Derived Data and apply the same restrictions to it. This simple drafting move closes a surprisingly common loophole.

Ignoring online terms and product toggles

Some vendors let administrators toggle “improve the service” or “share for model training” inside the admin console. Others bury it in updated online terms. Your addendum should say that no UI toggle or unilateral policy change can override the agreed restrictions unless the practice signs a written amendment. For digital workflow governance, our resource on safe AI adoption explains why process discipline matters as much as tool choice.

Implementation playbook for small practices

Step 1: classify your data and use case

Start by identifying exactly which documents will enter the AI workflow. Intake PDFs, referral letters, lab results, consent forms, and insurance documents should each be labeled as PHI or sensitive business data. Then determine whether the tool is used for storage, extraction, summarization, triage, or routing. This map tells you which clauses are mandatory and which business exceptions you can safely reject. For a workflow design analogy, the same discipline appears in data-flow-aware system design.
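The classification map described above can be as simple as a lookup table that denies any operation not explicitly allowed. A hypothetical sketch (document types, sensitivity labels, and operation names are illustrative, not drawn from any regulation or vendor):

```python
# Hypothetical map: document type -> (sensitivity label, explicitly allowed operations).
CLASSIFICATION = {
    "intake_pdf": ("PHI", {"ocr", "routing"}),
    "referral_letter": ("PHI", {"ocr", "summarization", "routing"}),
    "lab_result": ("PHI", {"extraction", "summarization"}),
    "consent_form": ("PHI", {"ocr", "storage"}),
    "vendor_invoice": ("business", {"ocr", "extraction", "storage"}),
}


def is_permitted(doc_type: str, operation: str) -> bool:
    """Allow an operation only if it is explicitly listed for that type."""
    _sensitivity, allowed = CLASSIFICATION.get(doc_type, ("unknown", set()))
    return operation in allowed  # unknown document types are denied by default
```

The design choice worth copying into the contract is the default-deny posture: anything not named in the permitted-use clause is prohibited, rather than the reverse.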

Step 2: redline the vendor agreement before procurement

Ask for the addendum to be incorporated before any pilot or trial use. If the vendor resists, treat that as a signal that the vendor may not be ready for sensitive health data. You are not being difficult; you are setting a boundary around patient trust and legal exposure. A vendor that cannot accept data-use restrictions is not a good fit for a clinic that handles protected documents.

Step 3: verify technical enforcement

Have the vendor explain how they separate production data from training data, how deletion works, and whether humans can review your content. Then compare the answers to the contract. The contract should reflect the system’s actual behavior, not aspirational marketing language. This is where a procurement approach inspired by cyber risk frameworks can be especially effective.

Pro Tip: Ask vendors to show you the exact product setting, backend workflow, or subprocessors list that supports their “no training” claim. If they cannot explain the mechanism, the contractual promise is weaker than it looks.

FAQ

Does this addendum replace a BAA?

No. A contract addendum limits how the vendor may use patient documents, but it does not automatically satisfy HIPAA or other healthcare privacy obligations. If the vendor handles PHI on behalf of the practice, you may still need a BAA and other compliance terms. The addendum works best as a companion document that closes AI-specific reuse gaps the BAA may not address.

Can a vendor still use de-identified data for model training?

Only if you explicitly allow it in writing and applicable law permits it. This guide recommends that small practices prohibit training on patient documents and Derived Data altogether unless a lawyer, privacy officer, and technical reviewer approve a narrowly scoped exception. De-identification language can be too easy to overstate, especially when modern AI systems can infer sensitive details from patterns.

What if the vendor says patient data is never stored?

That is helpful, but not enough. Transient processing can still involve logs, caches, support tools, exception handling, and human review. Your contract should still prohibit training, sale, advertising, and profiling, and it should define how transient data is handled if the tool needs to retain it briefly for service delivery.

Should the addendum cover embeddings and vector databases?

Yes. Embeddings and vectors are a form of Derived Data and can reveal sensitive information or be reused across products. If your vendor creates them from patient documents, they should be covered by the same use restrictions, retention limits, and deletion duties as the original files.
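As an illustration of how deletion duties might extend to vectors, here is a sketch that purges entries from a simple in-memory store by source document. The store layout (a list of dicts with a `metadata.source_doc_id` field) is hypothetical; real vector databases expose their own metadata-filtered delete operations.

```python
def purge_vectors(store: list[dict], deleted_doc_ids: set[str]) -> list[dict]:
    """Keep only vectors whose metadata does not reference a deleted document."""
    return [
        entry
        for entry in store
        if entry["metadata"]["source_doc_id"] not in deleted_doc_ids
    ]
```

Whatever the mechanism, the contractual point is the same: when a patient document is deleted, its embeddings must be deletable too, and the vendor should be able to demonstrate it.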

How do I ask for audit evidence without being unreasonable?

Start with a written attestation that the vendor does not use your data for training, sale, or advertising. Then ask for reasonable supporting documentation such as subprocessors, security controls, retention policy summaries, or a compliance letter. Small practices usually do not need a full on-site audit, but they do need enough evidence to trust the workflow.

Can this template be used for non-HIPAA documents too?

Yes. The structure works for any sensitive document workflow where a vendor should not reuse files for model training or monetization. You may want to adjust the definitions to fit legal, HR, financial, or insurance documents, but the same core principles apply.

Bottom line

A modern clinic needs the efficiency of AI-assisted document workflows, but it cannot trade away patient trust in the process. A well-drafted contract addendum gives you a practical way to permit OCR, routing, summarization, and automation while blocking the most dangerous forms of secondary use. The key is to define patient documents broadly, restrict derived data, prohibit training and advertising, control subprocessors, and make the addendum override conflicting terms. Combined with a BAA, technical safeguards, and a disciplined vendor review process, this template can help small practices adopt AI more safely and confidently.

For further reading on related governance and workflow topics, explore our internal resources on third-party signing risk, avoiding AI vendor lock-in, and privacy-first service design. If your practice is digitizing more records, our guide on digital signatures and online documents in care settings is a natural next step.


Related Topics

#legal #contracts #privacy

Jordan Ellis

Senior Compliance Editor

Senior editor and content strategist. Writing about technology, design, and the future of digital media. Follow along for deep dives into the industry's moving parts.
