The 6 Documents You Need for EU AI Act High-Risk Compliance
The EU AI Act doesn't just require you to classify your AI systems. It requires you to prove compliance on paper. If your AI system falls under Annex III (high-risk), you need a specific set of documentation ready before the August 2, 2026 deadline. No documents, no compliance. And non-compliance with the high-risk obligations carries fines of up to €15 million or 3% of your global annual turnover (the €35 million / 7% tier applies to prohibited AI practices).
Here's exactly what you need to prepare — and what each document actually covers.
1. Model Cards
A Model Card is a structured summary of what your AI system does, how it was trained, and where it performs well (or doesn't). Think of it as your AI system's ID card.
What to include:
- The intended purpose and use cases
- Training data sources and methodology
- Known limitations and failure modes
- Performance benchmarks across different populations
Regulators want to see that you understand your own system. If you can't describe what your model does and where it breaks down, that's a red flag in any audit.
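The fields above translate naturally into a machine-readable record you can keep under version control alongside the system itself. Here is a minimal sketch in Python; the field names and example values are illustrative, not prescribed by the Act:

```python
import json

# Hypothetical minimal model card. Field names and values are
# illustrative examples, not a format mandated by the EU AI Act.
model_card = {
    "system_name": "resume-screener-v2",
    "intended_purpose": "Rank incoming job applications for human review",
    "training_data": {
        "sources": ["internal applicant-tracking records, 2019-2023"],
        "methodology": "supervised fine-tuning on labelled hiring outcomes",
    },
    "known_limitations": [
        "Lower accuracy on CVs shorter than one page",
        "Not validated for non-English applications",
    ],
    "benchmarks": {
        "accuracy_overall": 0.91,
        # Performance broken down by population, as auditors expect
        "accuracy_by_group": {"age_55_plus": 0.84, "under_35": 0.93},
    },
}

# A JSON export doubles as the audit-ready artifact
print(json.dumps(model_card, indent=2))
```

Keeping the card as structured data (rather than only prose) makes it easy to diff between model versions and to show an auditor exactly what changed.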
2. Data Governance Records
Article 10 of the EU AI Act sets strict rules around training, validation, and testing data. Your Data Governance Record proves you followed them.
What to include:
- Data collection methods and sources
- How you handled bias detection and mitigation
- Data quality metrics and validation procedures
- Data retention and deletion policies
This is where many SMBs get stuck. You may be using third-party models or pre-trained systems — but you still need to document the data practices behind them as far as reasonably possible.
3. Risk Management Records
Article 9 requires a risk management system that runs throughout the entire lifecycle of your AI system — not just a one-time assessment.
What to include:
- Identified risks and their severity ratings
- Mitigation measures implemented
- Residual risk assessment
- Ongoing monitoring procedures
The key word is "ongoing." Your risk management documentation needs to show a living process, not a PDF you created once and forgot about.
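One way to evidence that "living process" is a risk register with scheduled review dates that you actually check. The sketch below is hypothetical; the entry, severity scale, and helper function are illustrative assumptions, not a format the Act requires:

```python
from datetime import date

# Hypothetical risk register entry; fields and values are illustrative.
risks = [
    {
        "risk": "Biased rankings for older applicants",
        "severity": "high",
        "mitigation": "Re-weighted training data; quarterly bias audit",
        "residual_severity": "medium",
        "next_review": date(2026, 6, 1),
    },
]

def overdue_reviews(register, today):
    """Return risks whose scheduled review date has passed.

    Flagging (and clearing) overdue reviews is simple evidence that
    monitoring is ongoing rather than a one-off assessment."""
    return [r["risk"] for r in register if r["next_review"] < today]

print(overdue_reviews(risks, today=date(2026, 7, 1)))
```

Even a lightweight check like this, run on a schedule, produces the timestamped trail that distinguishes a living risk process from a forgotten PDF.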
4. Human Oversight Protocols
High-risk AI systems must be designed so humans can effectively oversee them. Your Human Oversight Protocol documents exactly how that works in practice.
What to include:
- Who is responsible for oversight (roles and qualifications)
- What controls are in place to intervene or override
- When and how human review is triggered
- Training requirements for oversight personnel
This matters especially if your AI system makes or influences decisions about people — hiring, credit, insurance, or public services.
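"When and how human review is triggered" is easiest to audit when it's a concrete, written rule rather than a vague policy. A common pattern is a confidence threshold; this is a hypothetical sketch (the function name and threshold value are illustrative assumptions):

```python
def route_decision(confidence: float, threshold: float = 0.75) -> str:
    """Hypothetical escalation rule: outputs below the confidence
    threshold are routed to a qualified human reviewer instead of
    being applied automatically."""
    return "auto" if confidence >= threshold else "human_review"

# High-confidence output proceeds; low-confidence output escalates
print(route_decision(0.92))
print(route_decision(0.40))
```

Whatever rule you adopt, the oversight protocol should state it explicitly, including the threshold, who reviews escalated cases, and how overrides are logged.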
5. Conformity Assessments
Before placing a high-risk AI system on the EU market, you need a conformity assessment proving it meets all Chapter III, Section 2 requirements. For most Annex III systems, this is a self-assessment (no third-party auditor required).
What to include:
- Evidence of compliance with each applicable requirement
- Reference to harmonised standards applied
- Test results and validation data
- The EU Declaration of Conformity (a formal statement you sign)
The conformity assessment pulls together evidence from all your other documents. It's the capstone of your compliance package.
6. Transparency Notices
If your AI system interacts with people, generates content, or processes biometric data, you need a Transparency Notice. Article 50 (numbered Article 52 in earlier drafts) sets out the specific disclosure obligations.
What to include:
- Clear disclosure that an AI system is in use
- What the system does and how it affects the user
- How users can contest or seek review of AI-driven decisions
- Contact information for the responsible party
Transparency isn't optional, even for limited-risk systems. If people interact with your AI, they have the right to know.
How SMBs Can Actually Get This Done
Here's the reality: most SMBs don't have a compliance team. They don't have €10,000 for a consultant to prepare these documents. And they can't afford to ignore the deadline.
That's the problem Complizo was built to solve. You register your AI systems, classify them under Annex III, and Complizo auto-generates all six document types as audit-ready PDFs. Setup takes 5 minutes, and the free plan covers up to 3 AI systems — no credit card required.
If you're not sure whether your systems qualify as high-risk, start with a free risk classification to find out where you stand.
Don't Wait for the Deadline
The August 2, 2026 deadline for Annex III high-risk AI obligations is less than 131 days away. Yes, the European Commission has proposed pushing it to December 2027 through the Digital Omnibus, but that proposal hasn't passed Parliament yet, and trilogue hasn't started. Treating the extension as a sure thing is a gamble no SMB should take.
Start documenting now. The earlier you begin, the less painful the audit will be.
Complizo is a self-service EU AI Act compliance platform for SMBs. It does not provide legal advice. For the full text of the regulation, visit EUR-Lex.