
EU AI Act for Fintech SaaS: The AI Compliance Questions Your Banking Customers Are About to Send


The email arrived on a Tuesday afternoon.

It was from a compliance officer at a regional bank — one of the fintech SaaS company's biggest customers. Attached was a 72-question AI compliance questionnaire. At the top: "Please complete the attached document by end of month. This is required for our annual vendor review under EU AI Act Article 26."

The founder had been expecting a questionnaire eventually. But not this one. Not 72 questions. And not from their largest account.

If you sell B2B SaaS to financial institutions — credit scoring tools, fraud detection, loan underwriting assist, KYC automation, financial analytics with AI features — this scenario is arriving in inboxes now.

Here is what the questionnaire contains, and how to answer it.

Why Financial Services Sends the Most Demanding Questionnaires

Banks and financial institutions are regulated on two sides simultaneously. Their own regulators — the EBA, ECB, and national supervisory authorities — expect them to conduct due diligence on every AI vendor whose systems they deploy. And their customers — borrowers, account holders — may have individual rights under the EU AI Act when AI affects decisions about them.

So when a bank's compliance team sends you a questionnaire, it is longer and more detailed than what a technology company would send. They have already internalized the EU AI Act framework and they are applying it to your product.

Financial services AI tools frequently fall under Annex III high-risk classification when used in:

  • Credit scoring or assessment of creditworthiness
  • Risk assessment in insurance
  • Fraud detection that affects individual customers
  • Any AI that makes or influences decisions about access to financial services

High-risk classification triggers the full set of provider obligations — and the full set of due diligence questions from the deployer.

The 5 Questions Your Financial Services Buyers Will Ask

Q1: What is the scope and intended purpose of your AI system?

This sounds broad, but buyers need a precise answer they can insert into their own AI inventory. They need to know: what decision does your AI inform, what is the input, what is the output, and where in their workflow your output is consumed.

How to answer: Describe your AI in two to three sentences that specify the task (classification, scoring, recommendation, prediction), the input data type, the output format, and how the output is used. Avoid marketing language. Be literal.

Example answer: "Our AI system analyzes structured applicant financial data — income history, existing debt, payment behavior — to generate a creditworthiness score between 0 and 1000. The score is one input into loan officer underwriting decisions. The AI does not make final approval or rejection decisions; it surfaces a score and a ranked list of contributing factors for human review."

Q2: Is your AI system high-risk under Annex III, and what obligations follow?

Financial institutions need to know your risk classification position so they can document it in their own AI governance register.

How to answer: Take a position. Name the Annex III entry that applies (or explain why it does not apply). Then describe what obligations you are fulfilling as a result.

Example answer: "We classify our system as high-risk under Annex III, Point 5(b) of the EU AI Act, which covers AI used in creditworthiness assessment and credit scoring. As a high-risk AI provider, we maintain technical documentation under Article 11, implement a quality management system under Article 17, conduct post-market monitoring under Article 72, and will register the system in the EU AI database under Article 49 ahead of the August 2, 2026 deadline."

Q3: How do you ensure explainability for AI-influenced financial decisions?

This question comes with regulatory backing. Under GDPR Article 22, individuals have the right not to be subject to decisions based solely on automated processing, and, read together with Recital 71, this is widely interpreted as including a right to meaningful information about the logic involved. Financial regulators in Germany, France, and the Netherlands have specifically called for explainability in AI credit decisioning.

How to answer: Describe the explainability mechanism in your product concretely. What does the output include beyond a score? Are contributing factors expressed in plain language? Is there an audit log?

Example answer: "Our system outputs a score and up to five contributing factors ranked by impact, expressed in plain language — for example, 'Debt-to-income ratio above the threshold for this risk band.' These factors can be provided to affected individuals upon request via our reporting export. All decisions are logged with timestamp, model version, input hash, and output, and logs are retained for [X years] to support regulatory audit."
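The audit-log portion of that answer can be made concrete. Here is a minimal sketch of what one decision record might look like, with a deterministic input hash so identical payloads can be matched during an audit. The class name, field names, and helper function are illustrative assumptions, not a description of any particular product:

```python
import hashlib
import json
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass(frozen=True)
class DecisionLogEntry:
    """One audit-log record per scoring decision: when it happened,
    which model version produced it, a hash of the input payload,
    the score, and the ranked plain-language contributing factors."""
    timestamp: str
    model_version: str
    input_hash: str
    score: int
    factors: tuple


def log_decision(model_version, applicant_data, score, factors):
    # Canonical JSON (sorted keys) so the same input always hashes
    # the same way, regardless of dict ordering at call time.
    payload = json.dumps(applicant_data, sort_keys=True).encode("utf-8")
    return DecisionLogEntry(
        timestamp=datetime.now(timezone.utc).isoformat(),
        model_version=model_version,
        input_hash=hashlib.sha256(payload).hexdigest(),
        score=score,
        factors=tuple(factors),
    )
```

A record like this, retained per your stated retention period, is what lets you answer "can you reproduce why this applicant received this score?" with a lookup rather than a reconstruction.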

Q4: How do you monitor for model drift?

Banks need to know your system will perform accurately next year, not just today. Model drift — where accuracy degrades as real-world data patterns shift — is a known risk in credit and fraud AI, especially during economic volatility.

How to answer: Describe your post-market monitoring cadence with specific metrics and specific thresholds. Vague answers ("we monitor continuously") fail at this level of buyer.

Example answer: "We monitor model performance on a monthly basis using [specific metrics — e.g., Gini coefficient, KS statistic, Population Stability Index for input distribution]. When we detect significant drift — defined as PSI above 0.2 or Gini decline greater than 5 percentage points — we trigger an investigation and retraining cycle within [X] business days. Customers are notified of all model version updates and receive the performance metrics of new versions before deployment."

Q5: What technical documentation is available if our regulators ask for it?

Financial regulators may audit your customer's use of AI. Your customer may be required to produce documentation about your system. This is Article 11 and it is non-negotiable at this buyer level.

How to answer: List what you have, specifically. A vague reference to "documentation available upon request" does not satisfy a compliance officer.

Example answer: "We maintain EU AI Act Article 11-compliant technical documentation including: system architecture overview and data flow diagram, training data description and quality assessment including demographic audit, model validation reports including out-of-time testing and backtesting results, bias and fairness analysis, ongoing monitoring reports, and a change log with version history. This documentation package is available to enterprise customers and can be shared with their regulators under a standard data sharing agreement."

Why These Questionnaires Are Landing Now

Financial institutions have been preparing for the EU AI Act since the rules were finalized in 2024. The August 2, 2026 high-risk AI deadline has been on their roadmap for 18 months. Their compliance teams are not waiting until August. They are running vendor audits now — six to nine months ahead of the deadline — so that any gap in vendor compliance can be remediated before regulators start asking.

If you sell into financial services and have not received a questionnaire yet, it is coming. The institutions that move first are the most sophisticated buyers — the ones you most want to keep.

One Answer Set, Every Banking Deal

The five questions above will appear — in different wording, across different questionnaires — in every financial services deal you close this year. A large bank's questionnaire will be 72 questions. A fintech platform will send 15. A credit union may send 8. The core questions are the same.

Complizo stores your answers to these questions against the specific product controls that back them. When Q3 on explainability arrives again next month from a different institution, your answer is already there — consistent with every prior answer, linked to the specific feature that generates the contributing factors.

Try Complizo free — paste your first questionnaire.
