A Fortune 500 HR Buyer Just Asked What Their Own EU AI Act Obligations Are as Your Deployer: How to Answer the Article 29 Questions

4 min read

The email arrived from a VP of People Operations at a 22,000-employee consumer goods company in Frankfurt. Your AI-assisted recruitment platform had been in procurement review for two months. The deal was at legal sign-off stage. Then this:

"Before we execute this agreement, our legal team needs to understand what obligations we take on as the 'deployer' of your AI system under the EU AI Act. Can you provide a summary of what Article 29 requires of us, and confirm that your platform supports us in meeting those obligations?"

This is a new kind of question. The buyer is not asking about your system's compliance documentation. They are asking about their own regulatory obligations — and they want you to help them understand those obligations as a condition of signing.

It is a reasonable ask. The EU AI Act creates distinct obligations for both providers (you) and deployers (your customers). Most enterprise procurement teams have noticed this split and are now asking vendors to help them map it.

Here is how to answer.

What Article 29 Actually Requires of Your Customer

Article 29 of the Act as originally drafted sets out the obligations that deployers of high-risk AI systems must meet. In the final adopted text (Regulation (EU) 2024/1689), the same deployer obligations appear as Article 26, so expect legal teams to cite either number. For an HR tech customer deploying your AI recruitment or performance management tool, the core obligations are:

Use the system according to your instructions for use. The deployer must use the high-risk AI system in accordance with the instructions for use that you, as the provider, have supplied (Article 26(1) in the final adopted text). This is the most direct obligation, and the most frequently overlooked. If you have provided a user guide specifying the intended user population, the contexts of use, and the limitations of the system's outputs, your customer is obligated to follow it.

Assign human oversight. The deployer must assign human oversight of the AI system to natural persons with sufficient competence, training, and authority (Article 26(2) in the final adopted text). For an HR deployment, this means the customer must designate HR managers who are trained to interpret the AI's outputs critically, not just accept them as decisions.

Suspend use when risks arise. The deployer must inform the provider without undue delay if they identify a risk to health, safety, or fundamental rights, and suspend use of the system where appropriate pending your response (Article 26(5) in the final adopted text).

Conduct a Fundamental Rights Impact Assessment (FRIA). Article 27 requires certain deployers to assess the impact on fundamental rights before first use. Strictly, the obligation covers public-sector bodies, private entities providing public services, and certain credit and insurance use cases, so a private employer may fall outside it, but many enterprise legal teams conduct the assessment anyway as a matter of policy. Either way, it is the deployer's exercise, not yours, and they need your technical documentation to complete it.

Maintain logs. The deployer must keep the automatically generated logs of the system, to the extent the logs are under their control, for a period appropriate to the system's intended purpose and for at least six months (Article 26(6) in the final adopted text). In practice, employment deployers often retain logs considerably longer to align with employment-dispute limitation periods.
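As a quick internal sanity check, the log-retention floor can be sketched in a few lines of Python. The six-month figure is the minimum in Article 26(6) of the final text, approximated here as 183 days; the constant and function names are hypothetical, not part of any real compliance library.

```python
from datetime import date, timedelta

# Minimum deployer log-retention period under the final AI Act text
# (Article 26(6): at least six months). Approximated as 183 days;
# names are illustrative, not from any real library.
AI_ACT_MINIMUM_RETENTION = timedelta(days=183)

def retention_meets_floor(policy_days: int) -> bool:
    """True if the configured retention policy meets the Act's minimum."""
    return timedelta(days=policy_days) >= AI_ACT_MINIMUM_RETENTION

def purge_cutoff(today: date, policy_days: int) -> date:
    """Logs generated before this date may be purged under the policy."""
    return today - timedelta(days=policy_days)
```

A three-year policy (1,095 days) clears the floor comfortably; a 90-day default, common in SaaS logging stacks, does not.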

What You Need to Provide to Help Them Comply

Your customer cannot meet their Article 29 obligations without documentation from you. The specific items:

Instructions for use (required by Article 13(1)): A document describing the system's intended purpose, performance characteristics, known limitations, and the competencies required to interpret its outputs. This is the anchor document for their Article 29(1) obligation.

Technical documentation (Article 11, Annex IV): The full technical file. Your customer's legal team will cite it in their FRIA. Their procurement team will file it.

Human oversight guidance: A description of the oversight workflow your system supports — what human review is required before acting on an AI output, and how your interface makes that review practical.

Log access or export: Confirmation that deployers can access and retain the system's automatically generated logs for their required retention period.

If you have all of these ready, responding to this procurement question takes an hour. If you do not, you are not ready for enterprise EU sales.
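The readiness check described above can be expressed as a short script. The artifact keys mirror the four items listed in this section; everything else (the function name, the availability input) is hypothetical and illustrative only.

```python
# Hypothetical readiness check for the four deployer-support artifacts
# listed above. Artifact keys mirror the list; which ones a given
# vendor has on hand is illustrative input.
REQUIRED_ARTIFACTS = [
    "instructions_for_use",      # Article 13
    "technical_documentation",   # Article 11 / Annex IV
    "human_oversight_guidance",
    "log_access_or_export",
]

def missing_artifacts(on_hand: set[str]) -> list[str]:
    """Artifacts still needed before answering the procurement question."""
    return [a for a in REQUIRED_ARTIFACTS if a not in on_hand]
```

An empty result means the one-hour answer is within reach; anything else is the gap to close before the next enterprise EU deal.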

The Practical Answer to Send

Your response to the VP of People Operations can be structured as follows:

First, confirm whether your system is classified as high-risk under Annex III, point 4 (employment and workers management), or provide your classification rationale if it is not. This tells their legal team which regulatory track applies.

Second, list the Article 29 obligations that apply to them as a deployer, in plain language: use the system per your instructions, maintain human oversight, keep logs, conduct a FRIA.

Third, attach or link: your instructions for use, your technical documentation summary, your human oversight workflow description, and confirmation of log retention capability.

Fourth, offer to support their FRIA with a documentation briefing. Most enterprise legal teams have never done a FRIA. The vendor that helps them through it tends to close faster.

The Follow-Up They Will Ask About Next

After the deployer obligations, the next legal question is usually about incident handling: what happens if they, as deployer, identify a serious incident or a risk and need to report it? Who do they notify, in what timeframe, and what is your role as provider in supporting that notification? In the final text, the deployer's duty to inform you sits in Article 26(5), and serious-incident reporting to authorities runs through Article 73.

That answer belongs in your incident coordination procedure, which should sit alongside your post-market monitoring plan (Article 72) and serious-incident reporting process (Article 73) in your technical documentation.

Try Complizo free at complizo.com
