
The EU AI Act Deadline Is Less Than 4 Months Away — Here's What Your Customers Will Ask


Your biggest customer just sent over a new vendor questionnaire. Page three has a section you haven't seen before: "AI Act Compliance."

You stare at it. Questions about risk classification, Annex III, conformity assessments, human oversight mechanisms. You built a great SaaS product. You didn't build a compliance department.

Sound familiar? You're not alone. According to a recent readiness report, 78% of enterprises have not taken meaningful steps toward AI Act compliance. And the deadline — August 2, 2026 — is now less than four months away.

Here's what you need to know, and more importantly, how to answer the questions your customers are about to ask.

What Happens on August 2, 2026?

The EU AI Act's remaining provisions become fully enforceable. That means:

  • High-risk AI system requirements kick in. If your product falls under Annex III categories (employment, credit scoring, education, critical infrastructure), you must comply with rules on risk management, data governance, technical documentation, transparency, human oversight, accuracy, and cybersecurity.

  • Deployer obligations are active. Your enterprise customers who use your AI-powered product in the EU are "deployers" under the Act. They're responsible for compliance — and they'll push that responsibility upstream to you, their vendor.

  • Market surveillance begins. National authorities can investigate, audit, and fine. Penalties reach up to €35 million or 7% of global annual turnover, whichever is higher.

  • Extraterritorial scope applies. If your product produces outputs used in the EU or affects EU-based individuals, you're in scope — even if your company is headquartered in San Francisco, Tel Aviv, or Bangalore.

The bottom line: your EU customers can't buy from you unless you can prove compliance. And they'll prove due diligence by asking you pointed questions.

The 7 Questions Your Customers Will Ask

Based on the EU's model contractual clauses for AI procurement (MCC-AI) and real procurement questionnaires we've seen, here are the questions heading your way:

1. "What is the risk classification of your AI system?"

They need to know if your product is high-risk, limited-risk, or minimal-risk under the AI Act. High-risk systems (Annex III) face the strictest requirements. If you use AI for hiring decisions, credit scoring, or student assessment, you're almost certainly high-risk.

How to answer: State your classification clearly. Reference the specific Annex III category if applicable, or explain why your system falls outside high-risk scope.
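A first-pass classification can be as simple as keyword-matching your product's feature descriptions against the Annex III areas. The sketch below is purely illustrative (the category names and keywords are our own shorthand, not the Act's legal definitions), and a keyword match is a starting point for legal review, not a conclusion:

```python
# Hypothetical first-pass triage: match feature descriptions against
# paraphrased Annex III areas. Illustrative only -- not legal advice.

ANNEX_III_KEYWORDS = {
    "employment": ["hiring", "recruitment", "cv screening", "promotion"],
    "credit": ["credit scoring", "creditworthiness", "loan approval"],
    "education": ["student assessment", "exam scoring", "admissions"],
    "critical infrastructure": ["energy grid", "water supply", "traffic control"],
}

def classify(feature_description: str) -> str:
    """Return a tentative tier: 'high-risk (<area>)' on a match, else flag for review."""
    text = feature_description.lower()
    for area, keywords in ANNEX_III_KEYWORDS.items():
        if any(kw in text for kw in keywords):
            return f"high-risk ({area})"
    return "no Annex III match: review limited/minimal-risk tiers"

print(classify("Automated CV screening for recruitment"))
```

Any "high-risk" hit here just means the feature deserves a closer look against the actual Annex III text; the edge cases are exactly where you want a human (or counsel) in the loop.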

2. "Do you have a risk management system in place?"

Article 9 requires a documented, ongoing risk management process for high-risk systems. Your customers need to see that you've identified risks, tested mitigations, and have a process for continuous monitoring.

How to answer: Describe your risk management framework, including how you identify and mitigate risks related to health, safety, and fundamental rights.

3. "What data governance practices do you follow?"

Article 10 covers training, validation, and testing data. Customers want to know your data is relevant, representative, and free from bias.

How to answer: Explain your data sourcing, quality controls, bias testing procedures, and how you handle personal data under GDPR alongside the AI Act.

4. "Can you provide technical documentation?"

Article 11 requires comprehensive technical documentation that proves your system meets AI Act requirements. This isn't optional — it's a prerequisite for the conformity assessment.

How to answer: Confirm you maintain technical documentation covering system design, development methodology, risk management, and performance metrics.

5. "What transparency measures do you provide?"

Articles 13 and 50 require that deployers (your customers) can understand your AI system's capabilities, limitations, and intended purpose. They need clear instructions for use.

How to answer: Point to your user documentation, explain how your system communicates its AI-generated outputs, and describe any disclosure mechanisms.

6. "What human oversight mechanisms are built in?"

Article 14 requires that high-risk systems can be effectively overseen by humans. Your customers need to show their regulators that a person can intervene, override, or shut down the AI.

How to answer: Describe the human-in-the-loop or human-on-the-loop controls in your product, including override capabilities and alert systems.

7. "Have you completed a conformity assessment?"

For many Annex III systems, you need to complete a conformity assessment before August 2, 2026. Some categories require third-party assessment; others allow self-assessment.

How to answer: State your conformity assessment status, the method used (self-assessment or notified body), and when it was completed or is expected.

Why This Matters for Your Sales Pipeline

This isn't just a legal checkbox. It's a sales blocker.

Enterprise procurement teams are already adding AI Act compliance sections to their vendor questionnaires. If you can't answer these questions clearly and quickly, you lose the deal. Your competitor who can answer them wins.

The math is simple: by one industry estimate, 83% of organizations don't yet have a formal inventory of their AI systems. If you get ahead of this, you're in a small minority of vendors who make the procurement team's life easy.

How to Get Ready Before August 2

You don't need a team of lawyers or a six-month compliance project. You need to:

  1. Know your risk classification. Map your AI features to the AI Act's categories. This takes an afternoon, not a quarter.

  2. Prepare your answers. Draft clear, specific responses to the seven questions above. Reuse them across every customer questionnaire.

  3. Build your documentation. Technical documentation, risk management records, and transparency disclosures should live in one place, ready to share.

  4. Automate the repetitive parts. You'll get the same questions from different customers, phrased slightly differently. Having a consistent answer engine saves hours per questionnaire.
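The "consistent answer engine" idea in step 4 can be sketched in a few lines: keep one canned answer per topic, then route each incoming question to a topic by keyword. This is a minimal illustration of the concept, not Complizo's actual implementation; all the topic names, keywords, and placeholder answers below are made up:

```python
# Hypothetical sketch of a reusable "answer bank": one canned answer per
# compliance topic, matched against differently-phrased customer questions.

ANSWER_BANK = {
    "risk classification": "Placeholder: our classification under the AI Act, with Annex III reference.",
    "risk management": "Placeholder: our documented Article 9 risk management process.",
    "data governance": "Placeholder: our Article 10 data sourcing and bias-testing controls.",
}

TOPIC_KEYWORDS = {
    "risk classification": ["classification", "high-risk", "annex iii"],
    "risk management": ["risk management", "mitigation", "article 9"],
    "data governance": ["data governance", "training data", "bias"],
}

def answer(question: str) -> str:
    """Return the canned answer whose topic keywords match, else escalate."""
    q = question.lower()
    for topic, keywords in TOPIC_KEYWORDS.items():
        if any(kw in q for kw in keywords):
            return ANSWER_BANK[topic]
    return "No canned answer: route to manual review."

print(answer("What is the risk classification of your AI system?"))
```

Even this toy version captures the payoff: the answer is written once, reviewed once, and reused across every questionnaire, so the per-customer cost drops to routing rather than drafting.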

That last point is exactly what Complizo does. Paste your customer's questionnaire, get accurate, EU AI Act-aligned answers in minutes — not weeks.

Try Complizo free — paste your first questionnaire

The Clock Is Ticking

August 2, 2026, is not a soft launch. It's the day enforcement begins, fines become real, and procurement teams start rejecting non-compliant vendors.

You have less than four months. The questions are coming. The only question is whether you'll be ready with answers.


Complizo is the AI-powered questionnaire answer engine for EU AI Act compliance. Paste a questionnaire, get accurate answers. No jargon, no six-month projects.
