AMD and Nutanix advance open and scalable enterprise AI platform

An open and scalable enterprise AI platform is set to emerge from a multi-year technology collaboration focused on inference, hybrid deployment and agentic applications. The initiative combines silicon, software and orchestration to simplify AI rollouts.

DQC Bureau

AMD and Nutanix have entered a multi-year strategic partnership to jointly develop an open and scalable enterprise AI platform aimed at powering agentic AI applications across enterprises, service providers and edge environments.


The agreement brings together AMD’s silicon and AI software capabilities with Nutanix’s Cloud and Kubernetes platforms. The goal is to deliver production-ready AI infrastructure that allows enterprises to deploy AI models without relying on vertically integrated stacks.

The first jointly developed agentic AI platform is expected to reach the market in late 2026.

Financial commitment signals long-term intent

As part of the agreement, AMD will make a strategic investment of USD 150 million in Nutanix common stock at a purchase price of USD 36.26 per share. AMD will also fund up to USD 100 million to support joint engineering initiatives and go-to-market collaboration.


The equity investment is expected to close in the second quarter of 2026, subject to regulatory approvals and customary conditions.

This financial alignment strengthens the partnership beyond technology integration. It reflects a shared long-term roadmap around enterprise AI infrastructure.

Full stack integration across hardware and software

The open and scalable enterprise AI platform will integrate:

  • AMD EPYC CPUs for high-core-density compute and orchestration

  • AMD Instinct GPUs for inference acceleration

  • AMD ROCm software ecosystem for open runtime support

  • Nutanix Cloud Platform and Nutanix Kubernetes Platform

  • Nutanix Enterprise AI for unified lifecycle management


By combining silicon, runtime software and cloud orchestration, AMD and Nutanix aim to simplify AI deployment across datacentre, hybrid and edge environments.

The companies are positioning the platform as open by design. Enterprises will be able to run both open source and commercial AI models without dependency on closed AI stacks.
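As an illustration of what "open by design" deployment can look like in practice, the sketch below shows a generic Kubernetes pod spec that requests an AMD GPU through the ROCm Kubernetes device plugin's `amd.com/gpu` resource name. The container image and model name are hypothetical placeholders, not components of the announced platform.

```yaml
# Hypothetical sketch: serving an open-source model on a Kubernetes
# cluster whose nodes expose AMD GPUs via the ROCm device plugin.
# The image and model identifiers below are illustrative placeholders.
apiVersion: v1
kind: Pod
metadata:
  name: open-model-inference            # hypothetical workload name
spec:
  containers:
    - name: inference-server
      image: example.com/rocm-inference-server:latest   # placeholder image
      args: ["--model", "open-llm-7b"]                  # placeholder model
      resources:
        limits:
          amd.com/gpu: 1   # extended resource registered by the ROCm device plugin
```

Because the GPU is exposed as a standard Kubernetes extended resource, the same manifest pattern works for either open-source or commercial model servers, which is the kind of stack portability the companies are describing.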

Focus shifts to inference and operational scale

Enterprise AI infrastructure is moving into an inference-heavy phase. As organisations move from experimentation to production, infrastructure must deliver performance, efficiency and operational simplicity at scale.


AMD’s Instinct GPUs are expected to provide high-performance inference acceleration. EPYC processors will support orchestration and high-density compute environments. Nutanix will provide unified management across hybrid deployments.

The result is a co-engineered platform designed for:

  • Enterprise AI agents

  • Multi-modal inference services

  • Industry-specific intelligent applications

Hybrid and edge as default architecture

AMD and Nutanix are designing the open and scalable enterprise AI platform to operate across hybrid cloud, on-premises datacentres and edge locations.


This reflects enterprise reality. Data often resides in multiple environments. AI workloads must follow the data, not the other way around.

By integrating AMD’s enterprise AI capabilities into Nutanix’s full-stack solutions, the companies aim to provide operational consistency across distributed environments.

Strategic positioning in an open ecosystem

Both companies emphasise an open ecosystem approach built on open standards and interoperable software frameworks. Architectural choice is positioned as a key requirement for enterprises investing in AI infrastructure.


The partnership also includes collaboration with a broad set of OEM partners, signalling ecosystem-level alignment rather than a closed vendor model.

What this means for enterprises

For enterprises, the partnership between AMD and Nutanix signals three priorities:

  1. Greater flexibility in AI model deployment

  2. Infrastructure optimised for inference workloads

  3. Unified lifecycle management across hybrid environments


The real test will be execution. With the first platform expected in late 2026, the timeline suggests structured development rather than a rapid release.

As AI agents and inference workloads become foundational to enterprise computing, infrastructure strategy will matter as much as model performance. AMD and Nutanix are positioning this open and scalable enterprise AI platform as their answer to that shift.
