Interview with MLex - Malta’s privacy watchdog set to become market surveillance authority for AI Act

15 May 2025

The Information and Data Protection Commissioner (IDPC), Ian Deguara, was interviewed by Matthew Newman of the regulatory news portal MLex during the recently held Privacy Symposium in Venice. During the interview, Mr Deguara said that Malta's Information and Data Protection Commissioner will be the market surveillance authority for the EU’s AI Act. The IDPC, which is already in charge of fundamental rights under the AI Act, will be named as market surveillance authority on Aug. 2 for certain high-risk AI systems. To prepare for the two new roles, the authority is building up its expertise and will provide “comprehensive guidance,” Deguara said.

This is the full text of the MLex interview article:

Malta's Information and Data Protection Commissioner will be the market surveillance authority for the EU’s AI Act, the authority's head Ian Deguara has told MLex.

The IDPC, which is already in charge of fundamental rights under the AI Act, will be named as market surveillance authority on Aug. 2 for certain high-risk AI systems under the law’s annex three, Deguara said.

“I am preparing for this responsibility by ensuring that we introduce the necessary legislative amendments and build the required expertise and have the appropriate tools to effectively fulfil the new role of the market surveillance authority,” he said on the sidelines of a conference* in Italy.

To prepare for the two new roles, the authority is building up its expertise and will provide companies with “comprehensive guidance” on developing or deploying AI technologies.

“As we are seeing an exponential rise in the use of AI, it is essential that businesses understand the data protection risks which are presented by the processing activities,” Deguara said.

Under the AI Act, market surveillance authorities will conduct regular audits and monitoring. They will allow AI providers to voluntarily report any serious incidents or breaches of fundamental rights obligations.

Starting on Aug. 2, high-risk systems face strict regulations. Before placing a high-risk AI system on the market, providers must perform a conformity assessment and register the AI system in a central EU database. Once they are on the market, the provider must continuously update the conformity assessment.

Examples of high-risk systems include AI used in biometrics, critical infrastructure, education, employment, essential services, law enforcement, migration and justice.

In Malta, nine other authorities have been designated as fundamental rights authorities. The Malta Digital Innovation Authority is an additional market surveillance authority.

Deguara said it’s “fundamental” that these authorities work together on probes into breaches of the AI Act.

“My idea is to promote the establishment of a national committee, composed of the relevant actors that can meet on a periodic basis to share information and experience gained in addition to signing bilateral memorandums of understanding,” he said.

— GDPR simplification —

The European Commission plans to propose a simplification of the General Data Protection Regulation by the end of the year. The measure will focus on record-keeping obligations for businesses with fewer than 500 employees, while maintaining the GDPR's core principles.

Currently, the GDPR exempts companies with fewer than 250 employees from having to keep a register of their processing activities unless processing of personal data is a regular activity, poses a threat to individuals’ rights and freedoms, or concerns sensitive data or criminal records.

The commission is considering aligning this exemption with the GDPR’s Article 35 threshold, so that the obligation would apply only where processing is “likely to result in a high risk to the rights and freedoms of natural persons.” The current measure refers to processing “likely to result in a risk.”

In addition, the commission may delete references to “occasional processing” and “special categories of personal data,” which include sexual, religious and political information.

Deguara said he’s in favor of easing certain reporting burdens on small companies, but he opposes any measure that would remove references to high-risk processing for small companies. “While I absolutely support making compliance more manageable — especially for smaller businesses — it is important to keep in mind that the GDPR is based on a risk-based approach.”

“The size of the organization does not necessarily reflect the level of risk presented by the processing,” he said.

“In fact, even small or micro businesses can engage in processing that poses significant risks to the rights and freedoms of the data subjects.”

— Impact assessment —

Deguara also strongly opposes any measure that would remove the requirement for a Data Protection Impact Assessment. “A DPIA is not just a procedural requirement — it is a core accountability tool under the GDPR.”

“While I support efforts to make GDPR compliance more manageable — particularly for smaller organizations — simplification must never come at the cost of removing any meaningful safeguards,” he said. “DPIAs play a crucial role in embedding data protection into business practices and ensuring that data subjects’ rights are respected in high-risk scenarios.”

His views are in line with those of Bertrand du Marais of France's Commission Nationale de l'Informatique et des Libertés, who said that changing the obligations could harm consumers.

* Privacy Symposium 2025, Venice, May 12-16, 2025.