Sovereign by Architecture: Building AI Infrastructure for the EU AI Act
The EU AI Act takes effect August 2026. Compliance starts at the infrastructure layer. Learn why sovereign AI needs OpenStack, Kubernetes, and Atmosphere.
On August 2, 2026, the EU AI Act's most consequential provisions take effect, covering high risk AI systems in biometrics, critical infrastructure, employment, law enforcement, and more. Penalties reach up to €35 million or 7% of global turnover. The clock is running.
Most organizations are treating this as a legal problem. It's not. It's an infrastructure problem.
The regulation demands documented data governance, automatic logging, and full auditability, not just at the model layer, but across the stack beneath it. These aren't obligations you satisfy with a policy document. They live in how compute is provisioned, where data resides, and whether the platform can actually be inspected.
Yet most AI workloads today run on hyperscaler platforms where the control plane is closed, storage is opaque, and legal jurisdiction follows the provider, not the data center location. Selecting "EU Region" doesn't change who can be compelled to hand over access.
Sovereignty can't be configured after the fact. It must be designed into the foundation, through open, auditable platforms like OpenStack and Kubernetes, deployed on infrastructure the organization controls.
The EU AI Act is not just about how AI models behave. It is about how the systems beneath them are built, logged, and governed. Three articles in particular create direct infrastructure obligations.
Article 10 - Data Governance
High risk AI systems must be developed using high quality datasets for training, validation, and testing, managed properly across collection, preparation, bias mitigation, and gap analysis. This is not a documentation exercise. It requires infrastructure that tracks data lineage, preserves dataset versions, and can prove under audit what data a model was trained on and where it resided at every stage.
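Proving lineage starts with content-addressing the data itself. Below is a minimal Python sketch (stdlib only) of the idea: hash every file in a dataset and derive a single version identifier for the whole snapshot. The function name, directory layout, and manifest fields are illustrative assumptions, not a prescribed AI Act artifact format.

```python
import hashlib
import json
import time
from pathlib import Path

def fingerprint_dataset(dataset_dir: str) -> dict:
    """Hash every file in a dataset directory and emit a lineage manifest.

    The manifest records exactly which bytes a training run consumed,
    which is the kind of evidence an Article 10 audit asks for.
    """
    files = {}
    for path in sorted(Path(dataset_dir).rglob("*")):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            files[str(path.relative_to(dataset_dir))] = digest
    return {
        "dataset": dataset_dir,
        "captured_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "files": files,
        # One identifier for the whole snapshot: hash of the sorted
        # per-file digests. Any change to any file changes this value.
        "dataset_version": hashlib.sha256(
            json.dumps(files, sort_keys=True).encode()
        ).hexdigest(),
    }
```

Storing this manifest alongside each trained model ties a model version to an exact dataset version, so "what was this model trained on" has a checkable answer.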
Article 12 - Record Keeping
Article 12 requires that high risk AI systems allow for the automatic recording of events while the system is operating. The keyword is automatic. Not documented after the fact. Not reconstructed from memory. Automatic. This means infrastructure that captures every prediction, every input, every decision, and every outcome. Not sampling. Not best effort. Every event.
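One way to make recording automatic rather than best effort is to put it in the invocation path itself, so no caller can skip it. A hedged sketch in Python (the decorator, the toy model, and the in-memory log are illustrative assumptions; a real deployment would write to durable, append-only storage):

```python
import functools
import json
import time

AUDIT_LOG = []  # stand-in for an append-only store, not process memory

def record_events(fn):
    """Wrap a model call so every invocation is logged automatically.

    Logging lives in the wrapper, not in caller code, so it cannot be
    skipped, sampled, or reconstructed after the fact.
    """
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        entry = {
            "ts": time.time(),
            "system": fn.__name__,
            "input": json.dumps({"args": args, "kwargs": kwargs}, default=str),
        }
        try:
            result = fn(*args, **kwargs)
            entry["output"] = json.dumps(result, default=str)
            return result
        except Exception as exc:
            entry["error"] = repr(exc)
            raise
        finally:
            AUDIT_LOG.append(entry)  # recorded even when the call fails

    return wrapper

@record_events
def classify(features):
    # Stand-in for a real model; the scoring rule is hypothetical.
    return {"label": "high" if sum(features) > 1.0 else "low"}
```

Every call to `classify` now leaves an audit entry with input, output, and timestamp, including calls that raise.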
Article 13 - Transparency
Article 13 requires systems to include clear instructions for deployers and to be designed in a way that enables interpretation of outputs. If the infrastructure running the AI system is a black box, with closed control planes, opaque networking, and proprietary storage, meaningful transparency is not achievable.
The common thread is simple. Compliance requires far more than updated policies. Technical documentation, logging, and audit trail infrastructure are becoming prerequisites for operating in the EU market.
These requirements do not live in application code. They live in how infrastructure is architected. Who controls the storage. Who can access the logs. Whether every layer of the stack can be independently audited. If the platform is closed, compliance rests on a foundation that cannot be verified.
Most organizations assume that selecting a European region in their cloud provider's console solves the data residency problem. It does not.
The issue is not where the servers are. It is who controls them.
The US CLOUD Act of 2018 allows American authorities to compel US-based technology companies to provide data, regardless of where that data is stored. This means data held in the EU can still be accessed under US law, even when it belongs to non-US citizens.
This creates a direct conflict with EU regulation. The CLOUD Act enables access to data stored in the EU, while GDPR explicitly restricts such transfers. Article 48 of the GDPR states that court orders from third countries are only valid if based on an international agreement such as a Mutual Legal Assistance Treaty. The CLOUD Act bypasses these agreements entirely.
The result is structural, not theoretical. Organizations operating on US owned infrastructure are forced into a bind. Comply with GDPR and risk violating US law, or comply with US law and breach EU regulation.
US hyperscalers have responded with sovereign cloud offerings, promising EU data boundaries, enhanced controls, and isolated regions. These measures may improve security, but they do not resolve the core issue. US law still applies to US controlled providers. Jurisdiction does not change with region selection.
As a result, the conversation is shifting. It is no longer about where data is stored, but who has legal authority over the infrastructure. For AI systems governed by the EU AI Act, where training data, logs, and inference records must be auditable, traceable, and provably resident, this gap is not a nuance. It is a compliance risk.
The only durable solution is infrastructure operated outside US legal jurisdiction, built on open-source platforms that can be fully audited, and deployed on hardware the organization directly owns or governs.
The EU AI Act is the headline regulation. But it does not operate in isolation.
Three major EU regulations have turned sovereign cloud from a preference into a hard compliance requirement for specific industries and workload types. They overlap, they reinforce each other, and they all point to the same conclusion. You need to control your infrastructure.
DORA (Digital Operational Resilience Act)
DORA entered full enforcement in January 2025 and targets financial entities. It standardizes ICT risk management and operational resilience across financial institutions and their critical providers.
It requires tested failover, clearly defined recovery time and recovery point objectives, threat-led testing where applicable, and accurate registers of third-party ICT dependencies.
For organizations running AI in financial services, this means every GPU cluster, every training pipeline, and every inference endpoint must be documented, tested, and recoverable under stress. Infrastructure is now part of regulatory scope.
NIS2 (Network and Information Security Directive)
NIS2 applies to essential and important sectors such as energy, healthcare, transport, and digital infrastructure.
It enforces auditable cybersecurity baselines across critical services. This includes identity and access control, tamper evident logging, supply chain risk management, and strict incident reporting timelines.
The key requirement is demonstrability. It is not enough to claim security. You must provide telemetry, logs, and change history that an auditor can independently verify.
The Data Act
The Data Act introduces enforceable requirements around data access, portability, and cloud switching.
Organizations must ensure they can move data and workloads between providers without technical or financial barriers. This includes contractual obligations around exit terms, data portability, and the elimination of switching fees by January 2027.
Compliance across the EU AI Act, DORA, NIS2, and the Data Act is not achieved by adding features to an existing cloud deployment. It requires infrastructure that is sovereign, auditable, and portable by design.
This is not a feature checklist. It is a set of architectural constraints. If your platform cannot meet them natively, it is not compliant. It is compensating.
Here is what that means in practice.
Open source at every layer
If auditors cannot inspect the control plane, they cannot verify compliance. Proprietary platforms require trust in vendor documentation. Open-source platforms allow direct verification. Every API, scheduling decision, and access control policy must be traceable to code that can be inspected.
Jurisdictional control, not just data residency
Data must reside in a known jurisdiction, but the entity operating the infrastructure must also fall outside conflicting legal frameworks. Compliant infrastructure aligns operator, hardware, and legal governance within the same jurisdiction.
Immutable, automatic logging
Articles 10 and 12 of the EU AI Act require complete records, not best effort approximations. Infrastructure must capture events automatically, store them immutably, and expose them without reconstruction. If logs can be altered, delayed, or selectively enabled, they are not compliant. They are forensic artifacts, not audit evidence.
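Immutability can be made verifiable rather than asserted by chaining log entries, where each entry's hash commits to the previous one. A minimal stdlib Python sketch (the class and in-memory storage are illustrative assumptions; production systems would also persist entries to write-once storage):

```python
import hashlib
import json

class HashChainedLog:
    """Append-only log in which every entry commits to the one before it.

    Altering, deleting, or reordering any past entry breaks every later
    hash, so tampering is detectable on verification.
    """

    def __init__(self):
        self.entries = []

    def append(self, event: dict) -> str:
        prev = self.entries[-1]["hash"] if self.entries else "0" * 64
        payload = json.dumps(event, sort_keys=True)
        digest = hashlib.sha256((prev + payload).encode()).hexdigest()
        self.entries.append({"event": payload, "prev": prev, "hash": digest})
        return digest

    def verify(self) -> bool:
        prev = "0" * 64
        for e in self.entries:
            expected = hashlib.sha256((prev + e["event"]).encode()).hexdigest()
            if e["prev"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True
```

An auditor who trusts only the final hash can re-verify the entire chain independently, which is the difference between audit evidence and a log file.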
Documented exit strategy
DORA and the Data Act require that organizations can leave a provider without losing data, functionality, or continuity. This eliminates proprietary forks, vendor specific extensions, and storage formats that only function within a single ecosystem. Every component must be replaceable.
Workload portability
Compliance is not static. Regulations evolve and jurisdictions shift. Infrastructure must support moving workloads, including training pipelines, inference endpoints, and model registries, between environments without reengineering.
A managed AI service running on a closed control plane, with opaque storage and region-based isolation, cannot prove data lineage, cannot guarantee jurisdictional control, and cannot expose full system logs. It may function. It is unlikely to pass audit.
In architectural terms, this resolves into a clear model:
OpenStack governs compute, storage, networking, and identity through open and auditable APIs. It defines where resources physically reside and who can access them, forming the sovereign infrastructure layer.
Kubernetes orchestrates AI workloads through declarative, CNCF standard APIs. It enables training jobs, inference pipelines, and services to move across environments without vendor dependency.
Together, they form a stack that is sovereign at the foundation, portable at the workload layer, and auditable end to end. This is exactly what the converging EU regulatory landscape requires.
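Portability at the workload layer falls out of the declarative model: a Kubernetes Job is just standard `batch/v1` data, and `kubectl apply -f` accepts JSON as well as YAML. A sketch in Python (the job name, image, and GPU count are placeholder assumptions; `nvidia.com/gpu` is the standard NVIDIA device-plugin resource name):

```python
import json

def training_job_manifest(name: str, image: str, gpus: int) -> dict:
    """Build a declarative Kubernetes Job spec as plain data.

    Because the spec follows the standard batch/v1 schema, the same
    manifest runs on any conformant cluster, on-premise or hosted,
    with no provider-specific rewrite.
    """
    return {
        "apiVersion": "batch/v1",
        "kind": "Job",
        "metadata": {"name": name},
        "spec": {
            "template": {
                "spec": {
                    "restartPolicy": "Never",
                    "containers": [{
                        "name": "train",
                        "image": image,  # placeholder image reference
                        # Standard NVIDIA device-plugin resource name.
                        "resources": {"limits": {"nvidia.com/gpu": gpus}},
                    }],
                }
            }
        },
    }

# Serialize for submission to any conformant cluster:
manifest = training_job_manifest("finetune-demo", "registry.example/train:1.0", 2)
manifest_json = json.dumps(manifest, indent=2)
```

The point is not this particular spec but the property it demonstrates: the workload definition contains no vendor-specific fields, so moving it between environments is a scheduling decision, not a reengineering project.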
This architecture is not theoretical. VEXXHOST already delivers this today.
Global infrastructure, deployed where sovereignty demands it.
VEXXHOST operates data centers in Montreal, Santa Clara, and Amsterdam, with private cloud deployments worldwide across on-premise and hosted models. This footprint lets organizations place AI infrastructure in the jurisdiction compliance requires, not where a provider restricts them. Whether that means EU-resident data in Amsterdam, an on-premise deployment, or a hosted private cloud in a location of your choice, VEXXHOST delivers the same platform, support, and open-source stack everywhere. The infrastructure follows your compliance requirements, not the other way around.
100% open source. Fully auditable.
Atmosphere is built on upstream OpenStack and CNCF certified Kubernetes. There are no proprietary forks and no vendor specific extensions. Every component, from the control plane to the networking layer, is inspectable and traceable to upstream code. When auditors ask how a decision was made, the answer is in the codebase, not behind a vendor NDA.
Certified and conformant
VEXXHOST holds both OpenStack Powered certification and CNCF Kubernetes certification, having passed full conformance testing. When regulators or auditors ask whether the platform meets recognized standards, the answer is already established.
Compliance controls built in, not bolted on
SOC 2 and PCI DSS controls are embedded from day one. Immutable audit logs capture every action automatically. When an auditor asks who did what and when, the answer takes minutes, not weeks. This directly supports Article 12 of the EU AI Act and aligns with DORA’s operational resilience requirements.
Deploy where sovereignty demands it
Atmosphere supports on-premise, colocation, and hosted deployment models. You can run it on your own hardware, operate it within your own data center, or use a hosted private cloud with sovereignty guarantees. The platform remains consistent. The operational model adapts to compliance requirements.
AI ready from the ground up
Atmosphere supports GPU passthrough for full hardware performance, high performance networking with SR-IOV and DPDK, and advanced configurations such as vGPU, MIG, and PCI passthrough across virtual machines, bare metal, and Kubernetes. This is not a general purpose cloud adapted for AI. It is infrastructure designed for the compute intensive, data heavy workloads that the EU AI Act directly targets.
If you want to learn more about how VEXXHOST can also help you meet the GDPR requirements for AI workloads, we highly encourage you to read this blog post.
The EU AI Act does not ask where your data is. It asks who controls it and whether your infrastructure can be audited.
No hyperscaler setting solves that. Compliance is an infrastructure decision.
OpenStack for control. Kubernetes for portability. Atmosphere delivers both, ready before August 2, 2026.
The deadline is fixed. Start now.