9 min read · InovateAI Team

GDPR and AI in EU operations: a compliance primer

Most EU businesses hesitate on AI because of compliance fears. Here's what actually matters: data residency, lawful basis, DPAs, sub-processors, erasure, audit logs — and how the EU AI Act lands.

Almost every EU buyer we talk to wants to deploy AI into operations and is held back by some version of the same conversation: "Legal won't sign off." Sometimes that's a real obstacle. More often, it's a fog of half-understood requirements that nobody on either side of the table has actually mapped out. This post is the map.

We're not lawyers, this is not legal advice, and your DPO is the ultimate authority on your specific deployment. But after deploying AI Digital Workers into European businesses for the better part of a year, the same six topics come up every time. Here's what actually matters and what you can stop worrying about.

1. Data residency — where the bytes live

The single most common objection: "We can't send our data to a US-hosted AI service." This is not a GDPR text requirement per se — GDPR allows international transfers under specific safeguards (Standard Contractual Clauses, adequacy decisions, the EU-US Data Privacy Framework). But the practical reality after Schrems II, Microsoft Ireland, and a procession of CNIL and DPC enforcement actions is that EU-hosted is the path of least resistance. If your data never leaves the EU, you cut out most of the transfer-related legal review.

InovateAI runs all customer data processing on EU infrastructure (Germany and Ireland primary). Customer data is not used to train foundation models. We use EU-region endpoints for the underlying model providers wherever those endpoints exist, and we contractually guarantee data residency in our DPA.

2. Lawful basis (Article 6) and the role question (Article 28)

Two questions every deployment must answer:

What's the lawful basis for processing? For most B2B operational use cases — invoice processing, support triage, lead enrichment — the basis is either contract (Art. 6(1)(b), you're processing your own customer's data to deliver the service they paid for) or legitimate interest (Art. 6(1)(f), routine operational processing of your business contacts). Special category data is different: it falls under Article 9 and usually requires explicit consent (Art. 9(2)(a)) or another Article 9 exception. Consumer-facing flows often need consent too (Art. 6(1)(a)).

Who is the controller, who is the processor? When InovateAI processes your customers' data on your behalf, you are the data controller and we are the data processor under Article 28. That triggers the requirement for a written Data Processing Agreement, which we provide as standard with every deployment. The DPA defines what we can and can't do with the data, how long we hold it, and what happens when the contract ends.

3. The DPA, the sub-processors, and the surprise vendors

The Article 28 DPA is non-negotiable. What buyers often forget to ask is the sub-processor question. Modern AI services rely on a stack — the model provider (Anthropic, OpenAI, Mistral), the cloud host (AWS, GCP, Azure), the vector store, the observability tooling. Each of those is a sub-processor in GDPR terms: the DPA must name them, the controller must authorise them (or at least hold a right to object), and the processor must notify the controller when the list changes.

Hidden sub-processors are the single most common source of enforcement risk in AI deployments. When you evaluate any AI vendor, ask for the full sub-processor list before you sign. If they can't produce one in 24 hours, that tells you what you need to know. Ours is published and updated on every change.

4. The right to erasure (and what "delete" actually means)

Under Article 17, a data subject can ask you to erase their personal data, and you have to do it (with documented exceptions). For traditional SaaS this is mostly a database problem. For AI systems, there are three places data can hide:

  • Operational store — the database, vector index, cache, and logs. Deletable on request.
  • Conversation history / agent memory — also deletable, but only if the system was designed with per-subject partitioning. If the agent stores everything in one global memory, erasure is harder.
  • Foundation model training data — this is where people get nervous. The answer: if customer data has not been used to train the model, the model has nothing to forget. Use a vendor (and configure it) so customer data never enters training.
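The second point above — per-subject partitioning — is an architectural decision, not an afterthought. A minimal sketch of the idea (our illustration, not InovateAI's actual implementation): if agent memory is keyed by data-subject ID from day one, an Article 17 request becomes a single delete instead of a search through one global store.

```python
# Illustrative sketch: per-subject partitioning of agent memory, so that
# an erasure request maps to one keyed delete. Class and field names are
# hypothetical, not any vendor's actual schema.
from collections import defaultdict


class AgentMemory:
    """Conversation history partitioned by data-subject ID."""

    def __init__(self):
        # subject_id -> list of memory records for that subject only
        self._by_subject = defaultdict(list)

    def remember(self, subject_id: str, record: dict) -> None:
        self._by_subject[subject_id].append(record)

    def erase_subject(self, subject_id: str) -> int:
        """Handle an Article 17 request: drop every record for one subject.

        Returns the number of records removed, which can itself be logged
        as evidence that the erasure was executed.
        """
        return len(self._by_subject.pop(subject_id, []))


memory = AgentMemory()
memory.remember("subject-42", {"msg": "invoice query"})
memory.remember("subject-42", {"msg": "follow-up"})
memory.remember("subject-7", {"msg": "unrelated ticket"})

removed = memory.erase_subject("subject-42")  # erases only subject-42
```

The contrast case is an agent that appends everything to one undifferentiated memory blob: erasure then requires identifying and excising a subject's data from shared records, which is expensive at best and impossible at worst.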

InovateAI's architecture isolates each customer in a dedicated data space, with documented erasure routines, and customer data is contractually excluded from foundation model training.

5. Audit logs and the explainability question

GDPR Article 22 gives data subjects rights around decisions based solely on automated processing that produce legal or similarly significant effects. For most operational AI use cases (invoice processing, ticket triage) this article doesn't bite — there's no such effect on a data subject. But the spirit of the regulation, and the practical needs of any audit, demand full traceability of what the agent did and why.

Every action our Digital Workers take is logged with: input data, model version, prompt context, decision output, confidence score, and timestamp. If your auditor asks "why did the system do X on the 14th of March," you can answer with a structured record, not a shrug. This is non-negotiable for a serious EU deployment.
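To make that concrete, here is one hedged sketch of what such an action-level record could look like. The field names mirror the list above but are our illustration — not InovateAI's actual schema — and the values are invented:

```python
# Hypothetical audit-record shape: one immutable record per agent action,
# serialisable to JSON for export to an auditor. All identifiers below
# (action_id, model version string, etc.) are made up for illustration.
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone


@dataclass(frozen=True)  # frozen: records are append-only, never mutated
class AuditRecord:
    action_id: str
    input_data: dict
    model_version: str
    prompt_context: str
    decision_output: str
    confidence: float
    timestamp: str


record = AuditRecord(
    action_id="act-0001",
    input_data={"invoice_id": "INV-2024-0314"},
    model_version="model-v1.2",           # hypothetical version identifier
    prompt_context="invoice-approval policy v3",
    decision_output="approved",
    confidence=0.97,
    timestamp=datetime.now(timezone.utc).isoformat(),
)

# Structured export is what turns "why did the system do X on 14 March?"
# into a query instead of a shrug.
exported = json.dumps(asdict(record), indent=2)
```

The design point is the schema, not the storage: as long as every action emits a record with these fields, the backing store can be whatever your retention policy requires.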

6. NDAs, security, and the basics that aren't basic

Encryption in transit (TLS 1.3) and at rest (AES-256) is table stakes and we provide both. Mutual NDAs are signed before any sensitive discovery work. Access to customer environments is least-privilege and audit-logged. Backups are EU-resident. SOC 2 and ISO 27001 progress is documented in our security packet, available on request before contract signature. None of this is glamorous; all of it is what your security review will ask for in week two.

How InovateAI is GDPR-by-design

To summarise the architecture:

  • Operated by Agenticas OÜ in Tallinn, Estonia — an EU member state with a strong data-protection track record.
  • EU-hosted infrastructure (Germany / Ireland) with no data egress to non-adequate jurisdictions.
  • Standard Article 28 DPA provided with every contract, with named sub-processors and 30-day change notification.
  • Per-customer isolation, encryption in transit and at rest, and customer data excluded from foundation model training.
  • Full action-level audit logs retained for the contractual period and exportable on demand.
  • Mutual NDA standard before any discovery work.

We've been through enough EU procurement reviews to have packaged the answers up front. See the privacy policy for the public version, or request the full security packet on the contact page.

The EU AI Act — what it means for Digital Workers

The EU AI Act came into force in 2024 and is being rolled out in phases through 2026 and 2027. The framework classifies AI systems into four risk tiers — unacceptable, high, limited, and minimal — and most operational AI Digital Workers (invoice processing, support triage, lead enrichment) sit comfortably in the limited or minimal risk category. The transparency obligations are straightforward: tell users they are interacting with an AI system when relevant; document the system; provide information to deployers.

The watch-out is the high-risk tier, which captures AI used in employment decisions, creditworthiness, essential services, and a handful of other domains. If you're considering deploying a Digital Worker into a workflow that touches one of these — automated CV screening, credit underwriting, eligibility decisions — the compliance burden steps up significantly. We will tell you that honestly during scoping and walk through the implications, including the conformity assessment, fundamental-rights impact assessment, and ongoing monitoring obligations.

For 90% of operational use cases, the AI Act is not a barrier. It's a documentation exercise that mature vendors handle out of the box. For the 10% that are high-risk, the right vendor will tell you so before you sign — not eighteen months in.

The honest summary

Compliance fear is the single most expensive thing happening in EU AI adoption right now. It costs companies the productivity they could have, and it cedes ground to competitors in non-EU markets where the regulatory weather is friendlier. The good news: most of the fear is fog. Pick an EU-resident, GDPR-by-design vendor with a real DPA, named sub-processors, audit logs, and contractual exclusion of customer data from training, and the legal review becomes a checklist, not a crisis.

If you'd like to see what a deployment looks like inside a properly scoped EU compliance envelope, book a 30-minute scoping call. We'll walk you through the DPA, the architecture, and the logging on the call itself. Pricing and terms are on the pricing page — flat €5,000 setup, €1,000 per month, 30-day cancellation, no minimum.

