IDPC Participates on MCAST Panel on AI in Education
11 November 2025
Dr Marco Fagnano, Legal Counsel at the Office of the Information and Data Protection Commissioner (IDPC), participated in a panel organized by MCAST on 16 October 2025 focusing on Artificial Intelligence (AI) in the education sector, which brought together academics and regulatory personnel. The discussion centred on the implications of the EU Artificial Intelligence Act (AI Act) and the need to safeguard students' fundamental rights, notably as protected by the General Data Protection Regulation (GDPR), particularly where AI systems may be deployed for purposes of student assessment, progression and learning outcomes.
Dr Fagnano highlighted the importance of ensuring that AI tools used in academic settings, such as those for evaluating student performance or supporting learning outcomes, adhere to the privacy and data protection safeguards provided by the GDPR. These safeguards remain a noteworthy source within the emerging AI Act framework for protecting students where deployed algorithms may take decisions affecting their future or livelihood on the basis of the processing of their personal data.
A key point of discussion highlighted by Dr Fagnano centred on the GDPR right not to be subject to decisions based solely on automated processing that produce legal or similarly significant effects. Dr Fagnano emphasized the close alignment between this principle under the GDPR and the AI Act’s requirement for meaningful human oversight, noting that both instruments underscore the importance of maintaining human judgment and accountability in automated or algorithmic decision-making processes.
Dr Fagnano also underlined that, since AI models intended for deployment in the education sector are considered high-risk AI systems in light of the foregoing concerns, educational institutions such as MCAST that intend to integrate AI into assessment and teaching must undertake a fundamental rights impact assessment (FRIA) prior to deploying AI, factoring in fair and transparent governance considerations and measures to safeguard students' rights and ensure compliance with data protection and AI obligations.
Dr Fagnano stressed that, since the underlying concerns affect data protection rights, a data protection impact assessment (DPIA) is also highly advisable. He further highlighted that the AI Act requires a FRIA wherever a DPIA is required, and that the FRIA must complement the DPIA.
