A University Just Asked How Your AI Recommendation Engine Uses Student Data: Answering the Article 10 and GDPR Article 22 Section
The email arrived from the University of Amsterdam procurement office at 8:47 AM. You're 14 weeks into your edtech sales cycle. The contract is ready. Then comes Section 7: "Under the EU AI Act Article 10, describe the personal data categories processed during model training and inference. Under GDPR Article 22, explain whether your system makes solely automated decisions with legal or similarly significant effects on students. Provide evidence that Article 22 safeguards are implemented."
You have five business days to answer. Here's exactly what to say.
Why This Section Is Different From Other Questionnaire Sections
Most AI compliance questionnaire sections ask about risk classification, documentation, or human oversight. The Article 10 + GDPR Article 22 combination is different — it asks about data and decision-making rights simultaneously.
EU AI Act Article 10 covers the data governance requirements for training and inference data in high-risk AI systems. GDPR Article 22 gives individuals (including students) the right not to be subject to solely automated decisions that produce legal or similarly significant effects — and requires safeguards if such decisions are made.
When a university procurement team asks these questions together, they're probing for the most legally consequential combination in edtech: an AI system that processes student data and influences outcomes that affect the student's educational path.
Answering Article 10: Data Governance for Training and Inference
What they're actually asking
Article 10 requires that the training, validation, and testing data for high-risk AI systems meet specific governance standards: relevance, representativeness, freedom from errors and completeness (to the best extent possible), and documented examination and mitigation of possible biases.
How to answer
Break your response into training data and inference data.
Training data:
- What categories of student data were used (e.g., module completion rates, assignment submission timestamps, prior academic records, demographic attributes)
- Whether the training population is representative of the intended deployment population (e.g., if your model was trained on students at one university type, are you deploying at another? A simple representativeness check is sketched after this list)
- What bias mitigation steps were taken during data preparation (e.g., upsampling underrepresented groups, excluding protected characteristics from feature sets, fairness constraint testing)
- Who had access to training data and under what legal basis (typically legitimate interest under GDPR Article 6(1)(f) for ordinary personal data, plus an Article 9(2) condition, such as the research exception in Article 9(2)(j), for any special category data)
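One way to evidence the representativeness bullet above is a simple distribution comparison between the training population and the intended deployment population. A minimal sketch, assuming each student record carries a grouping attribute such as institution type; the grouping values and the 5% threshold are illustrative, not drawn from Article 10:

```python
from collections import Counter

def representativeness_gap(train_groups, deploy_groups, threshold=0.05):
    """Compare group proportions between the training and deployment
    populations and flag any group whose share differs by more than
    `threshold`. Returns {group: (train_share, deploy_share)} for
    flagged groups."""
    train_n, deploy_n = len(train_groups), len(deploy_groups)
    train_share = {g: c / train_n for g, c in Counter(train_groups).items()}
    deploy_share = {g: c / deploy_n for g, c in Counter(deploy_groups).items()}
    flagged = {}
    for group in set(train_share) | set(deploy_share):
        t = train_share.get(group, 0.0)
        d = deploy_share.get(group, 0.0)
        if abs(t - d) > threshold:
            flagged[group] = (t, d)
    return flagged

# Example: institution type of each student record
gaps = representativeness_gap(
    train_groups=["research_university"] * 900 + ["applied_sciences"] * 100,
    deploy_groups=["research_university"] * 400 + ["applied_sciences"] * 600,
)
# gaps -> {'research_university': (0.9, 0.4), 'applied_sciences': (0.1, 0.6)}
```

A flagged gap like this is exactly what the questionnaire answer should acknowledge, along with the mitigation step you took (reweighting, upsampling, or a deployment restriction).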
Inference data (runtime):
- What student data is processed at inference time (e.g., current module access patterns, time-on-task, quiz performance)
- Is any special category data processed (e.g., disability status, mental health indicators, or data revealing racial or ethnic origin)? If yes, the legal basis and Article 9 condition must be stated explicitly.
- How long inference inputs are retained and whether individual-level data persists beyond the session
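If your answer to the retention bullet is "session only", expect a follow-up on how that is enforced. A minimal sketch of a session-scoped store for inference inputs; the class and method names are hypothetical, not a reference to any specific product:

```python
import time

class InferenceInputStore:
    """Holds inference inputs only for the life of a session.
    Inputs are keyed by session ID and purged on session end or
    after a TTL, so no individual-level data persists."""

    def __init__(self, ttl_seconds=3600):
        self.ttl = ttl_seconds
        self._store = {}  # session_id -> (timestamp, inputs)

    def put(self, session_id, inputs):
        self._store[session_id] = (time.time(), inputs)

    def end_session(self, session_id):
        # Explicit purge when the student's session closes.
        self._store.pop(session_id, None)

    def sweep(self):
        # Safety net: purge anything older than the TTL, in case
        # a session was never explicitly ended.
        cutoff = time.time() - self.ttl
        for sid in [s for s, (ts, _) in self._store.items() if ts < cutoff]:
            del self._store[sid]
```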
The answer the procurement team needs to file
Give them a one-page data inventory table:
| Data category | Training / Inference / Both | Special category (Art. 9)? | Legal basis | Retention |
|---|---|---|---|---|
| Module completion rate | Both | No | Legitimate interest | [N] days inference, [N] years training |
| Prior academic records | Training only | No | Legitimate interest, with Art. 89 research safeguards | [N] years |
| Disability accommodations | Inference only | Yes | Explicit consent | Session only |
Universities file this table directly into their vendor risk register. You close the section.
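A practical way to keep this table accurate across questionnaires is to maintain the inventory as structured data and render the markdown from it, so the questionnaire answer, the DPA annex, and your internal register never drift apart. A minimal sketch; the field names mirror the table above and the example rows are illustrative:

```python
from dataclasses import dataclass

@dataclass
class DataCategory:
    name: str
    phase: str              # "Training only", "Inference only", or "Both"
    special_category: bool  # GDPR Art. 9 special category data?
    legal_basis: str
    retention: str

INVENTORY = [
    DataCategory("Module completion rate", "Both", False,
                 "Legitimate interest", "[N] days inference, [N] years training"),
    DataCategory("Disability accommodations", "Inference only", True,
                 "Explicit consent", "Session only"),
]

def to_markdown(inventory):
    header = ("| Data category | Training / Inference / Both "
              "| Special category (Art. 9)? | Legal basis | Retention |")
    sep = "|---|---|---|---|---|"
    rows = [
        f"| {c.name} | {c.phase} | {'Yes' if c.special_category else 'No'} "
        f"| {c.legal_basis} | {c.retention} |"
        for c in inventory
    ]
    return "\n".join([header, sep, *rows])
```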
Answering GDPR Article 22: Automated Decision-Making Safeguards
What they're actually asking
Article 22(1) gives individuals the right not to be subject to decisions "based solely on automated processing" that produce legal or similarly significant effects, unless one of the Article 22(2) exceptions applies: contractual necessity, authorization by law, or explicit consent. For students, "similarly significant" effects include: course recommendations that affect program completion, learning-path restrictions, or academic intervention triggers.
The question is binary: does your system produce outputs that directly determine something with this kind of effect? And if so, how?
How to answer
Option A: Your system is advisory, not determinative
If your AI recommendation engine produces suggestions that a faculty member, advisor, or the student themselves act on — but no decision is made automatically without a human step — you can credibly answer:
"Our system does not make solely automated decisions within the meaning of GDPR Article 22(1). Every recommendation is presented to [the student / the faculty advisor / the academic progress team] and requires human confirmation before any enrollment, grading, or support referral action is taken. No output from our system is automatically applied to a student record."
Describe the human step specifically. "Requires human confirmation" is a legal safeguard only if it's real — a genuine opportunity to review and override, not a click-through.
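In code, Option A's safeguard is usually a hard gate: the only write path to the student record requires a recorded human decision. A minimal sketch with hypothetical names for the record store and reviewer; your actual system will differ:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Recommendation:
    student_id: str
    action: str     # e.g. "refer_to_academic_support"
    rationale: str  # shown to the human reviewer

def apply_to_student_record(rec: Recommendation, reviewer_id: str,
                            approved: bool, record_store: dict):
    """The ONLY write path to the student record. It requires an
    explicit human decision; no code path applies a recommendation
    automatically."""
    audit = {
        "student_id": rec.student_id,
        "action": rec.action,
        "reviewer_id": reviewer_id,
        "approved": approved,
        "reviewed_at": datetime.now(timezone.utc).isoformat(),
    }
    if approved:
        record_store.setdefault(rec.student_id, []).append(rec.action)
    return audit  # retained as evidence of the human review step
```

The audit trail is what turns "requires human confirmation" from a claim into evidence a DPO can file.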
Option B: Your system does trigger automated actions
If any output from your AI automatically changes something in a student record — even a low-stakes action like unlocking a module, moving a student to a different learning track, or generating an academic risk alert that triggers automated notification — then Article 22 applies.
In this case, you must describe the Article 22(2) exception you rely on and the Article 22(3) safeguards:
- Explicit consent or contractual necessity: the Article 22(2) exception that provides the legal basis for the automated decision
- Human intervention on request: under Article 22(3), the student must be able to obtain human review of any decision
- A meaningful contest mechanism: the student must be able to express their point of view and challenge the decision, also under Article 22(3)
For each automated action your system triggers, describe all three. The university's GDPR officer will need this to document compliance.
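A per-decision record like the following sketch can serve as that documentation. The field names are illustrative, not a prescribed format; what matters is that the legal basis is captured for each decision and the Article 22(3) safeguard hooks actually function:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AutomatedDecision:
    student_id: str
    action: str       # e.g. "unlock_module"
    legal_basis: str  # Art. 22(2): "explicit_consent" or "contract"
    decided_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())
    review_requested: bool = False
    student_statement: str = ""  # Art. 22(3): right to express a point of view
    contested: bool = False      # Art. 22(3): right to contest

    def request_human_review(self):
        # Art. 22(3): the student can always trigger human intervention.
        self.review_requested = True

    def contest(self, statement: str):
        self.student_statement = statement
        self.contested = True
```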
The clause universities want in the contract
After your narrative answer, offer this as a Data Processing Agreement annex:
"Provider confirms that no output of the AI system is applied to a student record without an intervening human review step, unless the university's designated system administrator has explicitly configured automated actions in [product settings], in which case the university acknowledges it is the Article 22 data controller for those decisions and has implemented the required safeguards."
This clause transfers the Article 22 controller responsibility clearly when automated actions are university-configured — and protects you when they're not.
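On the product side, that clause is only credible if automated actions are genuinely off by default and every activation is attributable to the university's administrator. A minimal configuration sketch; the setting names are hypothetical:

```python
# Illustrative product settings matching the DPA annex above:
# automated actions are off unless the university's designated
# administrator explicitly enables them, and each activation is
# recorded so Article 22 controller responsibility can be traced.
DEFAULT_SETTINGS = {
    "automated_actions_enabled": False,  # vendor default: advisory only
    "enabled_actions": [],               # e.g. ["unlock_module"]
    "enabled_by_admin_id": None,         # who flipped the switch
    "enabled_at": None,                  # when (ISO 8601 timestamp)
}
```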
Putting It Together
The university procurement team isn't asking you to solve their GDPR compliance. They want to verify that deploying your system won't expose them to a GDPR enforcement action or student complaint. If you can hand them:
- A data inventory table (Article 10)
- A clear statement on whether Article 22(1) applies and why (or why not)
- If it does apply — evidence of the three safeguards
- A contract clause that maps Article 22 controller responsibility
— they'll forward your response to their DPO, get sign-off, and counter-sign.
Try Complizo free at complizo.com — paste your first questionnaire and get Article-traceable answers in minutes.