Mirantis integrates open-source NVIDIA NCX Controller into k0rdent AI


At KubeCon + CloudNativeCon Europe 2026 in Amsterdam, Mirantis announced integration of the open-source NVIDIA NCX Infra Controller into its modular k0rdent AI platform. The solution supports NVIDIA Ampere, Hopper, and Blackwell GPU architectures along with network fabrics such as InfiniBand and Spectrum-X Ethernet. Collaboration with Netris, Supermicro, and VAST Data enables global multi-tenant cloud infrastructure deployments. The integration enhances scalable automation and validation processes, rapidly accelerating AI workload delivery.

Mirantis Announces NVIDIA NCX Infra Controller Support At KubeCon

At KubeCon + CloudNativeCon Europe 2026 in Amsterdam, Mirantis announced support for the open-source NVIDIA NCX Infra Controller. This integration simplifies automated provisioning and management of GPU-accelerated infrastructure within the modular k0rdent AI platform. By combining validated reference architectures with Kubernetes-native orchestration, Mirantis delivers scalable, tenant-aware cloud solutions. Its partner ecosystem, featuring Netris, Supermicro, and VAST Data, ensures seamless integration of networking, compute, and storage modules for multi-tenant deployments.

Mirantis evolves open-source private cloud towards AI-native modular architectures

Over the past decade, Mirantis has advanced robust open-source private cloud technologies, orchestrating OpenStack and Kubernetes deployments for global service providers. As industry demands shift from virtualization-centric infrastructures toward AI-native frameworks, Mirantis leverages its contributions to k0rdent, Kubernetes, and OpenStack to architect modular, multi-tenant cloud solutions. By adhering to open standards, the company fosters interoperable platforms that accommodate evolving machine learning workloads while ensuring scalability, reliability and secure resource isolation.

k0rdent AI acts as a “Metal-to-Model” orchestration layer, unifying NVIDIA Ampere, Hopper, and Blackwell GPU architectures with network modules such as Quantum InfiniBand and Spectrum-X Ethernet. By abstracting hardware complexities into reusable deployment patterns, it delivers validated, tenant-ready infrastructure stacks optimized for AI workloads. Kubernetes-native automation streamlines provisioning and scaling, while integrated support for DGX, HGX, and MGX system platforms ensures consistent performance and operational efficiency across demanding machine learning environments.

Mirantis's k0rdent AI solution delivers a validated Metal-to-Model stack engineered for scalable AI production facilities. A unified control plane coordinates NVIDIA's Ampere, Hopper, and Blackwell reference architectures alongside high-performance networking technologies such as Quantum InfiniBand and Spectrum-X Ethernet, plus system platforms like DGX, HGX, and MGX. The integrated environment streamlines deployment, providing multi-tenant cloud platforms with preconfigured, optimized settings tailored to demanding AI workloads and ensuring rapid time-to-value and operational consistency.

Integrating the NVIDIA NCX Infra Controller into k0rdent AI empowers organizations to accelerate deployment of AI infrastructure with Kubernetes-native automation. Declarative manifests replace manual configuration, enabling parallel provisioning of compute, networking, and storage. Automated lifecycle operations slash setup time, prevent configuration drift, and enforce policy. This streamlined process shortens deployment durations, boosts consistency, and delivers ready-to-use environments. Consequently, enterprises shift focus from repetitive infrastructure tasks to model development and AI workflows.
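The declarative model described above can be sketched as a Kubernetes-style custom resource. Note that the `kind`, API group, and field names below are illustrative assumptions for this article, not the published k0rdent AI or NCX Infra Controller schema:

```yaml
# Hypothetical manifest: the GPUClusterDeployment kind and all field
# names are illustrative, not an actual k0rdent AI API.
apiVersion: example.k0rdent.mirantis.com/v1alpha1
kind: GPUClusterDeployment
metadata:
  name: tenant-a-training
  namespace: tenant-a
spec:
  template: hopper-spectrum-x     # validated reference architecture
  compute:
    gpuArchitecture: hopper       # ampere | hopper | blackwell
    nodeCount: 8
  networking:
    fabric: spectrum-x            # or: quantum-infiniband
  storage:
    provider: vast                # partner storage module
```

Applying such a manifest (for example with `kubectl apply -f`) would let a controller reconcile compute, networking, and storage in parallel, rather than through sequential manual steps, which is the mechanism behind the drift prevention and policy enforcement claims.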

Integration of the NVIDIA NCX Infra Controller into k0rdent AI drives the AI Cloud Ready Initiative by enabling Kubernetes-native automation and infrastructure provisioning. Users gain access to scalable deployments using reproducible templates, eliminating manual integration efforts. This approach accelerates rollout and maintains consistency across environments. Mirantis CTO Shaun O'Meara stresses that open standards are vital for turnkey AI solutions and asserts that partnering with NVIDIA boosts resource utilization and efficiency.

This collaborative partner ecosystem leverages Netris, Supermicro, and VAST Data to deliver scalable, repeatable AI deployment solutions. By integrating network virtualization, high-performance compute hardware, and storage architectures, the infrastructure components streamline lifecycle automation and configuration management. Cloud providers can rapidly provision validated AI clusters using predefined templates, ensuring consistency across multiple environments. As a result, organizations achieve consistent performance, streamlined operations, and accelerated time-to-insight for demanding machine learning workloads at enterprise scale.

Mirantis has formed strategic collaborations with Netris, Supermicro and VAST Data to seamlessly integrate modular infrastructure components. This partner ecosystem empowers cloud providers to rapidly deploy validated AI infrastructure stacks, while leveraging automated lifecycle management to achieve predictable performance and optimized operational efficiency across environments. By embracing an open, modular design philosophy, the solution catalyzes innovation, accelerates time to market and delivers unprecedented agility and flexibility for evolving cloud deployments.

Mirantis unveils open AI platform integrating k0rdent and NCX

Mirantis offers an open, modular AI infrastructure platform integrating k0rdent AI with the NVIDIA NCX Infra Controller, in cooperation with partners such as Netris, Supermicro, and VAST Data. Users leverage a validated metal-to-model architecture, Kubernetes-native automation, and reproducible deployment patterns to streamline operations across cloud environments. This solution drives efficient resource utilization, multi-tenant isolation, and rapid scaling. By emphasizing open standards and interoperability, Mirantis accelerates delivery of enterprise-grade AI cloud services.
