Requesty | May 2026 | Security / Best Practices | 12 min read

EU Compliant AI Routing: Why Your LLM Gateway Needs to Be GDPR and EU AI Act Ready

Thibault Jaigu
CEO & Co-Founder

On August 2, 2026, the EU AI Act's full enforcement kicks in. Penalties reach 35 million euros or 7% of global annual turnover, whichever is higher. If your AI system touches EU residents, every API call becomes a compliance event.

Most teams pick an LLM router and never think about where their requests actually go. That is a problem. Routers that run on global edge networks, like OpenRouter on Cloudflare, give you speed but no audit trail of which country processed your data. For teams subject to GDPR and the EU AI Act, that is not an acceptable trade-off.

This post breaks down the compliance requirements, explains why edge-based routing creates legal exposure, and shows how Requesty's EU infrastructure solves every problem on the checklist.

The regulatory landscape in 2026

Two regulations now govern how you route AI traffic involving EU data.

GDPR: The data residency foundation

GDPR has been in force since 2018, but supervisory authorities are now applying it specifically to AI systems. The key articles that affect LLM routing:

| Article | Requirement | What it means for AI routing |
| --- | --- | --- |
| Article 5(1)(c) | Data minimization | Do not log prompts beyond operational necessity |
| Article 28 | Processor obligations | Your LLM router is a data processor and needs a DPA |
| Article 30 | Records of processing | You must document every processing activity, including which infrastructure processed each request |
| Article 44 | Transfer restrictions | Personal data cannot leave the EU without an adequate legal basis |
| Article 48 | No foreign court transfers | You cannot hand over EU personal data to a non-EU authority based solely on their court order |

The March 2026 EDPB Coordinated Enforcement Action made this concrete. Twenty-five Data Protection Authorities across Europe started asking organizations a single question: can you document what personal data your AI agents processed, in which sessions, on what legal basis, and with what protections? Most teams running AI agents could not answer.

EU AI Act: The August 2026 deadline

The EU AI Act (Regulation 2024/1689) adds a second layer on top of GDPR:

  • Article 9: Risk management systems throughout the AI lifecycle
  • Article 12: Automatic event logging with a minimum six-month retention period
  • Article 14: Human oversight mechanisms
  • Article 50: Transparency obligations for all AI systems

The Annex III high-risk classification covers AI systems used in hiring, credit scoring, insurance, healthcare, and critical infrastructure. If your AI agents make decisions in any of these areas, full compliance is mandatory from August 2, 2026.

Even if your system is not high-risk, the general-purpose AI (GPAI) obligations, in force since August 2025, already require providers to maintain technical documentation and cooperate with downstream deployers on compliance.

The problem with edge-based AI routing

Edge networks are built for speed. Cloudflare has over 300 points of presence worldwide. When your application sends a request through an edge-based AI router, the network routes it to the nearest node for minimum latency. That is great for performance. It is terrible for compliance.

No data residency guarantees

When you send a prompt through OpenRouter's standard API, the request hits Cloudflare's edge. Which edge node? You do not know. The network decides based on latency, load, and availability. Your prompt might be processed in Frankfurt. It might be processed in London, New York, or Singapore.

For GDPR purposes, every time your request touches infrastructure outside the EU, you have a cross-border data transfer. Article 44 requires you to have a legal basis for that transfer, document it, and be prepared to justify it to a supervisory authority. If you cannot even tell which country processed your request, you cannot satisfy this requirement.

OpenRouter does offer EU in-region routing for enterprise customers. But their standard tier, which is what most teams use, runs on Cloudflare's global edge with no data residency enforcement.

No per-request audit trail

GDPR Article 30 requires records of processing activities. The EU AI Act Article 12 requires automatic event logging. Both demand that you can reconstruct what happened to a specific piece of data.

Edge-based routers typically provide aggregate analytics: total requests, total tokens, total cost. What they do not provide is a per-request log showing:

  • Which geographic region processed the request
  • Which model provider received the data
  • Whether any personal data was detected in the prompt
  • The full routing path from your application to the model and back

Without this information, you cannot answer the questions that a Data Protection Authority will ask during an audit. You cannot demonstrate compliance. You are relying on trust instead of evidence.
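To make that concrete, a per-request audit record with enough detail to reconstruct the processing chain might look like the following sketch. The field names here are illustrative, not any particular vendor's actual log schema:

```python
import json
from datetime import datetime, timezone

# Illustrative per-request audit record. Field names are hypothetical,
# chosen to show the level of detail an auditor expects per request.
audit_record = {
    "request_id": "req_01ABC",
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "region": "eu-central-1",      # which geographic region processed the request
    "provider": "bedrock",         # which model provider received the data
    "model": "claude-sonnet-4-5",
    "pii_detected": False,         # whether personal data was found in the prompt
    "routing_path": [              # full path from application to model and back
        "app",
        "router.eu.requesty.ai",
        "bedrock@eu-central-1",
    ],
}

print(json.dumps(audit_record, indent=2))
```

A record like this answers all four bullets above for any single request; aggregate token and cost totals cannot.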

The CLOUD Act conflict

This is the structural problem that no amount of configuration can fix with a US-based provider.

The US CLOUD Act of 2018 requires any company incorporated in the United States to produce data in its possession, custody, or control upon receiving a valid US government demand. It does not matter where the data is physically stored. A US-headquartered company with servers in Frankfurt is still subject to CLOUD Act demands for data on those servers.

This creates a direct conflict with GDPR Article 48, which states that personal data cannot be transferred to a non-EU authority based solely on a foreign court or administrative order.

For AI routing, the implications are clear:

  • OpenRouter is a US company. Even if they process your data in an EU region, they are subject to CLOUD Act demands.
  • Cloudflare is a US company. Their edge infrastructure, regardless of node location, is subject to CLOUD Act jurisdiction.
  • Choosing AWS Frankfurt or Azure Germany for your LLM provider does not solve the problem if the router in front of them is subject to US jurisdiction.

The Schrems II ruling from the Court of Justice of the European Union already invalidated the EU-US Privacy Shield over exactly this conflict. The current EU-US Data Privacy Framework attempts to address it, but legal scholars and the European Data Protection Board have expressed concerns about its durability. Planning your compliance strategy around a framework that may be challenged is not a defensible position.

What EU compliant AI routing actually requires

Based on the combined requirements of GDPR and the EU AI Act, an EU compliant AI router must provide:

| Requirement | Why it matters |
| --- | --- |
| EU data residency | All processing, storage, and logging must stay within EU borders |
| EU-parented infrastructure | The entity controlling the infrastructure should be EU incorporated or use infrastructure not subject to US CLOUD Act jurisdiction |
| Per-request audit logging | Every request must be logged with enough detail to reconstruct the processing chain |
| PII detection | Personal data in prompts must be detectable and optionally redactable before reaching the model provider |
| Model provider control | You must be able to restrict which model providers receive your data and ensure they operate within compliant regions |
| DPA availability | A Data Processing Agreement must be available that satisfies GDPR Article 28 |
| SOC 2 certification | Independent verification that security controls are in place and operating effectively |
| Zero data retention option | The ability to process requests without storing prompt or completion content |

How Requesty solves every requirement

Requesty was built for this problem. The EU infrastructure is not an afterthought bolted onto a global edge network. It is a dedicated stack running in Frankfurt, designed from the ground up for GDPR and EU AI Act compliance.

Dedicated EU endpoint

Swap one line of code. All your AI traffic stays in the EU.

Python
from openai import OpenAI
 
client = OpenAI(
    api_key="your_requesty_api_key",
    base_url="https://router.eu.requesty.ai/v1",  # EU endpoint
)
 
response = client.chat.completions.create(
    model="anthropic/claude-sonnet-4-5-20250514",
    messages=[{"role": "user", "content": "Analyze this contract."}]
)

The EU endpoint at router.eu.requesty.ai runs entirely on AWS eu-central-1 in Frankfurt, Germany. All request routing, logging, caching, and analytics stay within EU borders. No cross-border transfers. No edge hops through non-EU nodes.

For the Anthropic SDK, use the same endpoint without the /v1 suffix:

Python
import anthropic
 
client = anthropic.Anthropic(
    api_key="your_requesty_api_key",
    base_url="https://router.eu.requesty.ai",  # EU endpoint
)

Or set environment variables for tools like Claude Code:

Shell
export ANTHROPIC_BASE_URL=https://router.eu.requesty.ai
export ANTHROPIC_API_KEY=your_requesty_api_key

See the full EU Routing documentation for every SDK and framework.

EU only model inference

The EU endpoint handles Requesty's processing. But model inference, where the AI model actually runs, is a separate layer. By default, models can run anywhere their provider hosts them.

To guarantee end to end EU data residency, restrict your organization to EU region models only:

  1. Open the Model Library and switch to Table view
  2. Filter by EU regions: EU, FRANCECENTRAL, SWEDENCENTRAL, eu-central-1, eu-west-1
  3. Approve only EU hosted models

Available EU region models include:

| Provider | EU Region | Models |
| --- | --- | --- |
| AWS Bedrock | eu-central-1, eu-west-1, eu-north-1 | Claude Sonnet, Claude Haiku, Llama, Mistral |
| Google Vertex AI | europe-west1, europe-west4, europe-central2 | Gemini Pro, Gemini Flash |
| Azure OpenAI | francecentral, swedencentral | GPT-4o, GPT-4 Turbo |
| Mistral | EU native (Paris) | Mistral Large, Medium, Small |
| Nebius | EU native | Llama 3, DeepSeek |

Once you enable Approved Models with only EU selections, any request to a non-approved model is rejected. Your data never leaves the EU for inference.
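Requesty enforces this server side. As an extra safety net, you can also guard client side so a misconfigured call never leaves your process. A minimal sketch, assuming a hand-maintained allowlist that mirrors your Model Library approvals:

```python
# Client-side guard mirroring the server-side Approved Models list.
# The allowlist contents are illustrative; maintain yours from the Model Library.
EU_APPROVED_MODELS = {
    "bedrock/claude-sonnet-4-5-v2@eu-central-1",
    "bedrock/claude-sonnet-4-5-v2@eu-west-1",
    "bedrock/claude-3-5-haiku@eu-central-1",
}

def assert_eu_model(model: str) -> str:
    """Fail fast, before any prompt data is sent, if the model is not EU approved."""
    if model not in EU_APPROVED_MODELS:
        raise ValueError(f"Model {model!r} is not on the EU approved list")
    return model
```

The server-side rejection remains the authoritative control; this only catches configuration mistakes earlier, inside your own code.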

EU failover that stays in the EU

Production systems need failover. But if your fallback model is hosted in the US, your failover just became a GDPR violation.

Requesty's routing policies let you build failover chains that stay entirely within EU infrastructure:

Text
Policy: eu-compliant-failover
├─ bedrock/claude-sonnet-4-5-v2@eu-central-1 (2 retries)
├─ bedrock/claude-sonnet-4-5-v2@eu-west-1 (2 retries)
└─ bedrock/claude-3-5-haiku@eu-central-1 (1 retry)

If Claude Sonnet in Frankfurt goes down, traffic fails over to Claude Sonnet in Ireland, then to Claude Haiku in Frankfurt. At no point does your data leave the EU.

You can also set up EU load balancing to distribute traffic across EU regions:

Text
Policy: eu-balanced
├─ bedrock/claude-sonnet-4-5-v2@eu-central-1: 50%
└─ bedrock/claude-sonnet-4-5-v2@eu-west-1: 50%
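Requesty evaluates these policies server side; the sketch below only illustrates the semantics of the failover chain above, with `send` standing in for any function that dispatches a request to one named model:

```python
# Conceptual model of the eu-compliant-failover policy above.
# Requesty applies this logic server side; `send` is a stand-in for
# whatever function dispatches a request to one named model.
FAILOVER_CHAIN = [
    ("bedrock/claude-sonnet-4-5-v2@eu-central-1", 2),  # (model, retries)
    ("bedrock/claude-sonnet-4-5-v2@eu-west-1", 2),
    ("bedrock/claude-3-5-haiku@eu-central-1", 1),
]

def route_with_failover(send):
    """Try each EU model in order, exhausting retries before falling through."""
    last_error = None
    for model, retries in FAILOVER_CHAIN:
        for _ in range(retries + 1):  # initial attempt plus retries
            try:
                return send(model)
            except Exception as exc:  # outage, rate limit, timeout, ...
                last_error = exc
    raise RuntimeError("All EU models in the chain failed") from last_error
```

Because every model in the chain is EU hosted, a total failure raises an error rather than silently falling back to a non-EU region.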

Complete audit logging

Every request through Requesty is logged with:

  • Timestamp, model, provider, and region
  • Token counts (input, output, cached)
  • Latency and cost
  • Request metadata (custom tags you attach for tracking)
  • Guardrail results (PII detected, blocked prompts)

This satisfies GDPR Article 30 (records of processing), EU AI Act Article 12 (automatic event logging), and gives your DPO the evidence they need for supervisory authority inquiries.

Access your logs through Usage Analytics or export them programmatically through the API.
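Once exported, those logs can feed automated residency checks. The sketch below assumes a JSON-lines export with `region` and `request_id` fields; both the format and the field names are assumptions for illustration, not the documented export schema:

```python
import json

# Residency check over exported request logs. The JSON-lines format and
# the "region"/"request_id" field names are assumptions for illustration.
EU_REGIONS = {"eu-central-1", "eu-west-1", "eu-north-1"}

def non_eu_requests(log_lines):
    """Return the IDs of requests processed outside the EU."""
    return [
        record["request_id"]
        for record in map(json.loads, log_lines)
        if record.get("region") not in EU_REGIONS
    ]

logs = [
    '{"request_id": "req_1", "region": "eu-central-1"}',
    '{"request_id": "req_2", "region": "us-east-1"}',
]
print(non_eu_requests(logs))  # prints ['req_2']
```

A check like this, run on a schedule, turns the audit log from passive evidence into an active alert on residency violations.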

PII detection and scrubbing

Requesty's guardrails detect and redact personal data before it reaches any model provider:

  • Email addresses, phone numbers, social security numbers
  • IBANs, credit card numbers, financial identifiers
  • Custom regex patterns for domain-specific data (internal IDs, proprietary formats)

The scrubbing happens within the EU infrastructure. The model provider never sees the raw personal data. This satisfies GDPR data minimization (Article 5(1)(c)) and gives you defense in depth even if a model provider has a breach.
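To illustrate the general technique (this is not Requesty's implementation, which runs server side inside the EU stack), a minimal regex-based scrubber looks like:

```python
import re

# Minimal illustration of regex-based PII redaction. Real guardrails use
# broader pattern sets and validation; this only shows the mechanism.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "IBAN": re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{11,30}\b"),
}

def scrub(text: str) -> str:
    """Replace detected PII with typed placeholders before the prompt is sent."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

print(scrub("Contact anna@example.com, IBAN DE44500105175407324931"))
# prints: Contact [EMAIL], IBAN [IBAN]
```

Typed placeholders like [EMAIL] and [IBAN] preserve the prompt's structure, so the model can still reason about the redacted text.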

SOC 2 Type II and DPA

Requesty is SOC 2 Type II certified, meaning an independent auditor has verified that security controls are not just designed but operating effectively over time. A Data Processing Agreement is available on request.

Zero data retention

For maximum data minimization, Requesty can operate in zero retention mode. Requests are proxied to the model provider in real time and nothing is stored, logged, or cached after delivery. Requesty never trains on your data.

Requesty vs OpenRouter: The EU compliance comparison

| Requirement | Requesty | OpenRouter |
| --- | --- | --- |
| Dedicated EU endpoint | router.eu.requesty.ai in Frankfurt | EU routing for enterprise tier only |
| Standard tier EU residency | Yes, available to all users | No, standard tier uses global Cloudflare edge |
| EU-only model restriction | Yes, via Model Library approvals | Manual provider selection, no enforcement |
| Per-request audit log | Yes, full request logging with region, model, cost | Aggregate analytics only |
| PII detection and scrubbing | Built-in guardrails (email, phone, IBAN, custom regex) | None built in |
| SOC 2 Type II | Certified | Not publicly listed |
| DPA available | Yes, on request | Not publicly available |
| CLOUD Act exposure | EU infrastructure on AWS eu-central-1 | US company, Cloudflare (US) edge network |
| Failover within EU | EU-only routing policies with regional failover | No EU-constrained failover |
| Zero data retention | Available | Depends on provider policies |

A compliance checklist for your AI routing

Use this checklist to evaluate whether your current AI routing setup meets GDPR and EU AI Act requirements before the August 2, 2026 deadline:

Data Residency

  • All AI requests from EU users are processed within EU infrastructure
  • No edge network hops through non-EU countries
  • Model inference runs on EU hosted infrastructure
  • Failover and fallback paths stay within the EU

Audit and Transparency

  • Per-request logging captures region, model, provider, and timestamp
  • Logs are retained for minimum 6 months (EU AI Act Article 12)
  • You can reconstruct the processing chain for any individual request
  • PII detection results are logged per request

Data Protection

  • PII is detected and optionally redacted before reaching model providers
  • You have a signed DPA with your AI router provider
  • Your router provider is SOC 2 Type II certified
  • Zero data retention is available for sensitive workloads

Legal

  • Your router provider is not subject to CLOUD Act jurisdiction, or you have documented the risk and mitigation
  • Cross border data transfers are documented with appropriate safeguards
  • Your Records of Processing Activities include AI routing
  • Data Protection Impact Assessment covers your AI system

Getting started with EU compliant routing

The migration takes five minutes:

  1. Sign up at app.requesty.ai and get your API key from the API Keys page.

  2. Switch to the EU endpoint. Change your base URL to https://router.eu.requesty.ai/v1 (OpenAI compatible) or https://router.eu.requesty.ai (Anthropic compatible). Same API key, same request format.

  3. Restrict to EU models. Open the Model Library, filter by EU regions, and approve only EU hosted models.

  4. Set up EU failover. Create a routing policy with EU only models in your fallback chain.

  5. Enable guardrails. Turn on PII detection in your security settings to scrub personal data before it reaches any provider.

  6. Request your DPA. Contact the Requesty team through the EU page to get your Data Processing Agreement.

The bottom line

The August 2026 deadline is not theoretical. Supervisory authorities are already asking organizations to demonstrate how their AI systems handle personal data. Twenty-five DPAs ran a coordinated enforcement action in March 2026 specifically targeting AI agent deployments.

Edge based routers that scatter your data across a global network cannot answer the questions these authorities are asking. You need infrastructure that gives you full control over where your data goes, complete audit trails of what happened to it, and the legal structure to back it up.

Requesty's EU infrastructure gives you all three. One line of code to switch. Full GDPR compliance. Full EU AI Act readiness. No compromises on model selection, performance, or cost.

Further reading

  • EU Routing Documentation — Full technical guide to Requesty's EU endpoint, EU model selection, and regional routing policies.
  • Requesty vs OpenRouter — Detailed platform comparison covering cost, security, analytics, and compliance.
  • Security and Compliance Checklist — SOC 2, HIPAA, and GDPR requirements for LLM gateways.
  • Guardrails — How to configure PII detection, prompt injection protection, and content policies.
  • Fallback Policies — Build EU constrained failover chains for production reliability.
  • Cost Tracking — Monitor and optimize AI spend with per request cost attribution.

Frequently asked questions

What is the EU AI Act August 2026 deadline?
On August 2, 2026, the EU AI Act's high-risk provisions for Annex III systems take full effect. Organizations deploying AI systems that affect EU residents must demonstrate compliance with risk management, data governance, record keeping, transparency, and human oversight requirements. Penalties reach up to 35 million euros or 7% of global annual turnover.
Why is edge-based AI routing a problem for GDPR compliance?
Edge networks like Cloudflare route requests to the nearest point of presence, which can be anywhere in the world. You have no guarantee that your data stays in the EU. When a request hops through a non-EU node, you have a cross-border data transfer that GDPR Article 44 requires you to document and justify. Most edge routers provide no audit trail of which node processed your request.
Does OpenRouter support EU data residency?
OpenRouter offers EU in-region routing for enterprise customers, but their standard tier runs on Cloudflare's global edge network with no data residency guarantees. There is no built-in per-request audit log showing which region processed your data, which makes proving GDPR compliance to a supervisory authority difficult.
How does Requesty handle EU data residency?
Requesty runs a dedicated EU endpoint at router.eu.requesty.ai hosted in Frankfurt, Germany on AWS eu-central-1. All request routing, logging, caching, and analytics stay within EU borders. You can restrict model inference to EU regions only by approving only EU hosted models in the Model Library. SOC 2 Type II certified with DPA available on request.
What is the CLOUD Act problem with US-based AI routers?
The US CLOUD Act of 2018 requires any US-incorporated company to produce data in its possession upon receiving a valid US government demand, regardless of where that data is physically stored. This means even if a US provider hosts your data in Frankfurt, they can be legally compelled to hand it over. This creates a direct conflict with GDPR Article 48, which prohibits transferring personal data to non-EU authorities based solely on a foreign court order.