Understanding AI Transparency: Where the EU AI Act Meets GDPR

23 February 2025

Navigating the world of AI regulation can feel like a maze. Transparency is key, and two major pieces of legislation – the EU AI Act and the GDPR – are essential guides. This article breaks down the core transparency requirements, focusing on what matters most for in-house counsel. The AI Act's central concern is making sure people know when they are interacting with AI; many of its specific transparency obligations derive from Article 50, which sets out information requirements for providers and deployers of AI systems.

Providers vs. Deployers: Who’s Who?

The AI Act (Article 3) makes a crucial distinction between providers (those who develop or significantly modify AI systems) and deployers (those who use them). Providers build the AI and deployers put it into action. Of course, some organisations wear both hats.

For Providers: A Deeper Dive

Developing AI? Here's what you need to focus on:

  • High-Risk AI: Think healthcare, transportation, etc. (Annex III). You'll need robust design and development measures, clear instructions for use, and registration in the EU database.
  • Human Interaction: Design your AI so people know they're interacting with it.
  • Synthetic Content: Clearly mark anything that is AI-generated as such.
  • Data Disclosures: Be transparent about any personal data used for training, testing, or validation (Articles 13 and 14 GDPR).

And don't forget the details: comprehensive instructions, contact information, performance limitations, and anything that could impact safety or fundamental rights (Annex IV). Essentially, you are required to equip deployers with what they need to understand and use your AI responsibly.

For Deployers: Your Responsibilities

Using AI? Here's your checklist:

  • High-Risk AI at Work: Tell your employees before you deploy it.
  • Synthetic Content: Disclose it. No surprises.
  • Emotion Recognition/Biometrics: Inform the people affected.
  • Data Disclosures: Update your privacy policies to describe any new personal data processing (Articles 13 and 14 GDPR).

GDPR: The Bigger Picture

GDPR's transparency requirements are broader, covering the processing of all personal data. Think clear language, information about purpose, retention, recipients, and data subject rights. It's about empowering individuals to control the use of their personal data.

Case Studies

The following examples are fictional scenarios designed to illustrate the application of the AI Act and GDPR. They do not represent real companies.

  • NeuralGuard Labs: AI-powered alert system for a multinational corporation, monitoring public safety and security incidents from media sources. The system generates automated email and SMS alerts without human review.
    • Provider (NeuralGuard Labs): Must design the system to clearly mark alerts as AI-generated.
    • Deployer (multinational corporation): Must disclose to alert recipients that the alerts are AI-generated.

  • ProductivityAI: High-risk productivity tracking and assessment tool for a corporate customer. It tracks employee time, task management, and performance, generating recommendations for workflow optimisation.
    • Provider (ProductivityAI): Must design the tool with transparency in mind, provide detailed instructions for use, register the tool in the EU database, and disclose information about personal data used for training.
    • Deployer (corporate customer): Must inform employees and their representatives about the tool's use before deployment, update privacy policies to reflect the use of personal data, and be transparent about how the AI is used for automated decision-making.

  • InnovateAI Solutions: AI-powered virtual assistant for a financial services institution. It augments human operator responses to customer queries by understanding sentiment and generating response suggestions.
    • Provider (InnovateAI Solutions): Must design the system to clearly indicate to human operators when they are interacting with the AI assistant and that response suggestions are AI-generated.
    • Deployer (Financial Services Institution): Must update privacy policies to inform both operators and customers about the use of personal data for continuous learning and customer service optimisation.

There are other factors to consider in these case studies – including the potential impact on the employment relationship with operators using these tools, requirements for impact assessments, and contract clauses to be reviewed – all of which are beyond the scope of this article.

The Bottom Line

Both the AI Act and GDPR emphasise transparency, but they approach it from different angles. The AI Act is focused on AI-specific interactions, while GDPR provides a broader data protection framework. Understanding both is crucial for staying compliant and building trust.

Need help navigating AI regulation? Data Driven Legal specialises in data protection and AI governance. Contact us today for practical, straightforward advice.