Provider or Deployer? The EU AI Act Question Your Customer Is Actually Asking (and How to Answer It)
The questionnaire arrived with a question in Section 1 that the founder had never seen before.
"Under the EU AI Act, does your company consider itself an AI provider, an AI deployer, or both? Please explain your reasoning with reference to the relevant Articles."
The founder ran a SaaS company. They had built the product. They were selling it to enterprise customers. But "provider or deployer" — which one were they exactly?
They looked it up. Found two conflicting blog posts. Spent two hours going in circles. Eventually wrote something vague and moved on.
Three weeks later, the same question appeared in a different questionnaire from a different customer — phrased differently but asking the same thing. The answers they sent were inconsistent.
This question is appearing on enterprise procurement questionnaires right now. Here is the definitive answer, and why getting it right on the first questionnaire matters for every answer that follows.
The Definitions (And Why They Matter)
The EU AI Act draws a clear line between two roles under Article 3.
An AI provider (Article 3(3)) is any entity that develops an AI system — or has it developed — and places it on the market or puts it into service under its own name or trademark. If you built the AI, trained the model, and are selling or licensing it to customers, you are a provider.
An AI deployer (Article 3(4)) is any entity that uses an AI system under its own authority in a professional context. If you are taking an AI system built by someone else and incorporating it into your own product or service, you are a deployer.
Why does this distinction matter? Because the compliance obligations differ — significantly.
Providers of high-risk AI systems must:
- Maintain technical documentation (Article 11)
- Implement a quality management system (Article 17)
- Register the system in the EU AI database (Article 49)
- Draw up an EU Declaration of Conformity (Article 47)
- Implement post-market monitoring (Article 72)
- Report serious incidents to national authorities (Article 73)
Deployers of high-risk AI systems must:
- Ensure appropriate human oversight of AI-assisted decisions (Article 26)
- Keep the logs automatically generated by the system (Article 26(6))
- Ensure transparency to affected individuals when legally required (Article 26(11))
- Not modify a high-risk AI system beyond its intended purpose without re-evaluation
The obligations are different. If you answer the provider/deployer question incorrectly, every downstream answer about your obligations will be wrong — and a careful procurement team will notice the inconsistency.
Most B2B SaaS Companies Are Providers
If you have built an AI feature into your product — even a single AI-powered feature — and you are selling or licensing that product to business customers, you are an AI provider under the EU AI Act. Full stop.
You built the AI. You put it in your product. You put your product on the market under your name. That is the provider definition.
This applies even if:
- The AI feature is powered by a third-party model (an LLM API, a computer vision service, a scoring API). You are still the provider relative to your customers.
- You call it an "AI-assisted" feature rather than "AI." The Act's definition (Article 3(1)) turns on whether the system infers from its inputs how to generate outputs, using techniques such as machine learning, not on what you label it.
- You are a small company. The EU AI Act applies to companies of all sizes.
If you use a third-party AI model (an OpenAI API, a Hugging Face model, a third-party scoring engine), you are simultaneously a deployer relative to that model's provider, and a provider relative to your own customers. Both/and — not either/or.
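The both/and logic above can be written down as a simple decision helper. This is an illustrative sketch only: the class, field, and function names are hypothetical (nothing here comes from the Act or any library), and a real classification always needs legal review.

```python
from dataclasses import dataclass


@dataclass
class AiFeature:
    """Hypothetical description of one AI feature in a product."""
    built_in_house: bool          # you developed or trained the system yourself
    uses_third_party_model: bool  # e.g. an LLM API or a third-party scoring engine
    sold_under_own_brand: bool    # placed on the market under your own name


def classify_roles(feature: AiFeature) -> set[str]:
    """Return the EU AI Act roles a company likely holds for this feature.

    Mirrors the article's rules: selling an AI feature under your own
    brand makes you a provider (Article 3(3)); consuming a third-party
    model also makes you a deployer of that model (Article 3(4)).
    """
    roles = set()
    if feature.sold_under_own_brand and (
        feature.built_in_house or feature.uses_third_party_model
    ):
        roles.add("provider")  # relative to your customers
    if feature.uses_third_party_model:
        roles.add("deployer")  # relative to the model's provider
    return roles


# A SaaS product with an LLM-API-powered feature sold under its own brand
# holds both roles at once:
print(sorted(classify_roles(AiFeature(False, True, True))))
```

Note that the roles come back as a set, not a single value: for most B2B SaaS companies with API-backed AI features, the honest answer to the questionnaire is both.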
What the Questionnaire Is Actually Asking
When a customer asks "provider or deployer?", they are doing two things at once.
First, they are trying to understand which obligations sit with you. If you are the provider of a high-risk AI system, they want evidence that technical documentation, post-market monitoring, and incident reporting are in place, because that is what will satisfy their own regulators if your product is ever examined as part of their supply chain.
Second, they are trying to understand their own obligations. A deployer of a high-risk AI system has specific duties under Article 26. They want to understand what those duties are relative to your product — and they want you to tell them, not their lawyers.
How to Answer This Question
Here is a template that works for most B2B SaaS companies with AI features built in-house (or via API that they control):
"[Company name] operates as an AI provider under the EU AI Act (Article 3(3)). We develop the AI system, maintain and update the underlying model, and place the system on the market under our own brand.
As the provider, we are responsible for:
- Maintaining technical documentation (Article 11)
- Implementing a quality management system (Article 17)
- Post-market monitoring and performance reporting (Article 72)
- Registration in the EU AI database for high-risk systems (Article 49)
Our customers operate as deployers under Article 3(4) in that they deploy and use the AI system within their own professional context. Deployer responsibilities that apply to our customers include: ensuring appropriate human oversight of AI-assisted decisions (Article 26), maintaining logs of use where required, and ensuring transparency to affected individuals in accordance with applicable law.
We provide customers with the technical documentation, monitoring reports, and support materials needed to fulfill their deployer obligations and to respond to regulatory inquiries about AI tools in their stack."
Adapt this for your specific situation. If you are a fintech company using a credit-scoring model from a third-party provider, add a sentence noting that for the third-party model you are a deployer, and describe what due diligence you conduct on that provider.
The Follow-Up Questions
Once you answer "provider," three follow-up questions arrive almost immediately in the same questionnaire:
1. Is your AI system high-risk under Annex III? This requires you to look at the Annex III list and take a position. Annex III covers eight categories: biometrics, critical infrastructure, education, employment, essential services, law enforcement, migration, and the administration of justice. If your product operates in any of these verticals, you need to check the specific Annex III use cases, and there is a strong chance the system classifies as high-risk.
2. What technical documentation do you have? Article 11 specifies what belongs in technical documentation. Buyers ask because they may need to produce it to a regulator. "Documentation available upon request" is no longer sufficient — name what you have specifically.
3. What is your post-market monitoring process? Article 72 requires high-risk AI providers to monitor the performance of deployed systems. Buyers want specifics: what metrics, what frequency, what threshold triggers review, and how are customers notified of issues.
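The Annex III check in point 1 can be sketched as a first-pass screen. The category strings below paraphrase the Annex III headings and the function name is made up for illustration; a match means "get a legal opinion on the specific use case," not "definitely high-risk."

```python
# Paraphrased Annex III high-risk categories (EU AI Act).
ANNEX_III_CATEGORIES = {
    "biometrics",
    "critical infrastructure",
    "education and vocational training",
    "employment and worker management",
    "essential private and public services",
    "law enforcement",
    "migration, asylum and border control",
    "administration of justice and democratic processes",
}


def likely_high_risk(product_verticals: set[str]) -> bool:
    """First-pass screen: does any vertical the product serves fall in an
    Annex III category? A positive result flags the system for a proper
    use-case-level assessment; it is not a legal determination."""
    return bool(product_verticals & ANNEX_III_CATEGORIES)


print(likely_high_risk({"employment and worker management"}))  # True
print(likely_high_risk({"project management"}))                # False
```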
Getting the provider/deployer answer right means all three follow-ups can be answered consistently, because the framework for your answers is established.
Why Section 1 Sets the Tone for the Entire Deal
Procurement teams notice when answers contradict each other. Section 1 establishes your role. If Section 1 says "deployer" and Section 4 describes provider-level monitoring and documentation, the inconsistency signals either that you don't understand your own obligations, or that you are not answering carefully.
Either signal is bad.
The provider/deployer question is the foundation. Every answer about documentation, testing, incident reporting, and customer obligations builds on it. Answer it correctly the first time — and the same way every time.
Complizo starts with role classification and risk classification before you answer a single questionnaire question. Your role is established once. Every answer that references your obligations is framed correctly — and consistently — across every deal.
Try Complizo free — paste your first questionnaire.