NetApp strengthens enterprise-grade data platform to accelerate modern AI workloads

Today, at its INSIGHT 2025 event, NetApp is making multiple announcements, the most significant of which is a package of new and enhanced solutions for its data platform that address a major customer pain point: ensuring that customers get access to AI-ready data.

“As we talk about AI, the customer challenge that we’re constantly trying to solve is helping customers provide AI-ready data for their data scientists,” stated Jeff Baxter, VP of Product Marketing at NetApp. “Almost every analyst group out there has different statistics, but there’s a Gartner statistic that 60% of AI projects will be abandoned through the next year due to lack of AI-ready data. While there are many challenges, problems and opportunities in AI to solve, NetApp, as the intelligent data infrastructure company, thinks that, at the core, providing AI-ready data is a challenge that we can help customers with.”

Baxter said they do that with what they now call the NetApp Data Platform.

“We’ve actually had a data platform for a long time,” he indicated. “I don’t think we’ve ever really formally called it the NetApp Data Platform, but starting at INSIGHT, that’s what we’ll be calling it. It is a NetApp data platform that takes into account all of our unified storage across on-prem and cloud, with data services built on top of it. We never really had a specific name for it. We had talked about the NetApp Data Fabric in the past, or a NetApp portfolio. There was never a name for the entire NetApp platform, so this is us first instantiating that name.”

Beyond the rebranding, Baxter emphasized that the new solution offerings themselves are significant.

“First, we are rolling out a new modern disaggregated storage system called NetApp AFX,” he said. “NetApp AFX is built to be a large-scale disaggregated storage system that can scale out to exascale, to exabytes of capacity, and to massive performance. Out of the gate, it’s going to come out as NVIDIA SuperPOD certified to be able to build AI factories on top of it.”

The new NetApp AFX decouples performance and capacity with a disaggregated version of NetApp ONTAP that runs on the new NetApp AFX 1K storage system. An enterprise-grade disaggregated all-flash storage system built for demanding AI workloads, NetApp AFX is a powerhouse data foundation for AI factories. AFX delivers the same robust data management and built-in cyber resilience that NetApp is known for, along with secure multi-tenancy and seamless integration across on-premises and cloud environments. AFX is designed for linear performance scaling up to 128 nodes, with terabytes per second of bandwidth, exabyte-scale capacity, and independent scaling of performance and capacity. Optional DX50 data control nodes enable a global metadata engine for a real-time catalog of enterprise data and leverage NVIDIA accelerated computing.
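
As a rough, back-of-envelope illustration of what linear scaling to 128 nodes implies, the short sketch below multiplies a hypothetical per-node bandwidth figure by node count. The per-node number is an assumption chosen purely for illustration, not a published AFX specification.

```python
# Back-of-envelope sketch of linear scale-out, assuming (hypothetically) that
# each AFX node contributes a fixed amount of read bandwidth. The per-node
# figure below is an illustrative placeholder, not a NetApp specification.

PER_NODE_GBPS = 40   # assumed bandwidth per node, in GB/s (placeholder value)
MAX_NODES = 128      # AFX is designed for linear scaling up to 128 nodes

def aggregate_bandwidth_tbps(nodes: int, per_node_gbps: float = PER_NODE_GBPS) -> float:
    """Aggregate bandwidth in TB/s under a purely linear scaling assumption."""
    return nodes * per_node_gbps / 1000

for n in (8, 32, MAX_NODES):
    print(f"{n:>3} nodes -> ~{aggregate_bandwidth_tbps(n):.1f} TB/s")
```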

“The key part of NetApp AFX for us is that it runs NetApp ONTAP, which is the same operating system we’ve been building on and improving for the last three decades, such that we have stated that it’s the most secure storage on the planet,” Baxter stressed. “AFX has got all of the enterprise data mobility features built in, it’s got all of the hybrid cloud connectivity built in, and it’s enterprise-proven and resilient across tens of thousands of customers. And now, using that exact same operating system, we’re able to build a disaggregated storage backend onto it to get this massive sort of AI scale and linear granularity and expandability that people want for AI workloads, and sort of bring the best of both worlds. It’s one of these new AI-scale modern architectures, but using the proven enterprise-grade capabilities of NetApp ONTAP.”

AFX is a brand new offering, not a renaming or redoing of something NetApp had before.

“NetApp AFX is a brand new product that leverages NetApp ONTAP, so today we have NetApp AFF, which is our leading unified all-flash storage,” Baxter said. “We also have, again using that same ONTAP, NetApp ASA, which is our block-optimized storage. So you can think of NetApp AFX as being our unstructured-optimized storage built for AI, for massive exascale AI workloads. It is a net new product, but it leverages 99% the same NetApp ONTAP software for the data management layer on top of this new disaggregated backend. The other important thing is that because the NetApp AFX architecture is a disaggregated architecture, we can now bring accelerated computing powered by NVIDIA GPUs directly into the cluster, so they have high-speed direct access to the data storage without having to move that data storage. And we use those accelerated compute nodes to run a data service that we call the NetApp AI Data Engine.”

The NetApp AI Data Engine is the second big announcement at INSIGHT.

“The NetApp AI Data Engine is essentially a software service, and its job is to really address some of those challenges that make 60% of these AI projects fail,” said Eric Schou, VP of Marketing at NetApp. “It’s really targeting some of those challenges, to make some of these projects a lot more successful and repeatable. That 60% number has a lot to do with complexity. The AI Data Engine turns a lot of this complexity into clarity by delivering a very simple, secure, intelligent, end-to-end AI data pipeline.”

The NetApp AI Data Engine is a secure, unified extension of ONTAP integrated with the NVIDIA AI Data Platform reference design that helps organizations simplify and secure the entire AI data pipeline, all managed via a single, unified control plane. Together, these capabilities unify high-performance storage and intelligent data services into a single, secure, and scalable offering that accelerates enterprise AI retrieval-augmented generation (RAG) and inference across hybrid and multicloud environments. Customers will be able to access these products through direct purchase or through a subscription to NetApp Keystone STaaS. With NetApp AFX and the NetApp AI Data Engine, the NetApp data platform can ensure that all applicable data is immediately ready for AI.

“So this is not another point solution,” Schou emphasized. “This is fully integrated into ONTAP. It already leverages NetApp technologies such as Snapshot, SnapDiff, and SnapMirror, and it enables those AI workflows in a way that really nobody else can do in the market. The AI Data Engine also gives you a very unified, global view of your entire NetApp data estate. Essentially, that’s how it delivers clarity.”

“With the new NetApp AFX systems, customers now have a trusted, proven choice in on-premises enterprise storage built on a comprehensive data platform to rapidly propel AI innovation forward,” said Syam Nair, Chief Product Officer at NetApp. “NetApp AI Data Engine enables customers to seamlessly connect their entire data estate across hybrid multicloud environments to build a unified data foundation. Enterprises can then dramatically accelerate their AI data pipelines by collapsing multiple data preparation and management steps into the integrated NetApp AI Data Engine, built with NVIDIA accelerated computing and NVIDIA AI Enterprise software complete with semantic search, data vectorization and data guardrails. The combination of NetApp AFX with AI Data Engine provides the enterprise resilience and performance built and proven over decades by NetApp ONTAP, now in a disaggregated storage architecture, and all still built on the most secure storage on the planet.”

Schou indicated that there are three pieces to this NetApp AI Data Engine: the metadata engine, data guardrails, and data curator.

“The metadata engine gives you the ability to instantly find and understand your data,” he said. “So, pretty simple. No more hunting around various silos. The metadata engine is what gives you a very global, structured view of all of your NetApp data. It automatically scans your entire data estate and gives you a very rich, interactive catalogue. The data guardrails ensure your AI is built on a foundation of trust and security from Day One. All this stuff is integrated and built in, not bolted on afterwards. This will automatically scan and classify your data, and identify potentially sensitive information like PII.

“Finally, once you’ve found your data, the data curator gets it ready for AI,” Schou said. “This has been a really big challenge for customers. This is where we really tackle that data bloat. For GenAI apps, you need to turn your unstructured data into vector embeddings, and the data curator does this right at the storage layer, and when combined with our own advanced compression, it can actually reduce the size of that data set by up to 10x, which is substantial.

“So essentially, the NetApp AI Data Engine accelerates enterprise AI and gives you confidence by unifying discovery, curation, and governance,” Schou concluded. “And it’s all storage integrated, as I said before. And it’s really an end-to-end solution that transforms your unstructured data into secure, AI-ready data pipelines. The secret sauce here, and what differentiates NetApp, is the enterprise-grade capabilities that NetApp brings to bear. There are a lot of small startup companies with a lot of good marketing material online, but the amount of data that NetApp is currently managing, the length of time that we have been in this business, and the enterprise-grade methodology, specifically around ONTAP, that we leverage in both AFX as well as the NetApp AI Data Engine really stand alone in how we’re approaching AI, and specifically AI data pipelines.”
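
The AI Data Engine is a packaged service rather than something customers script themselves, but a minimal sketch helps make the three pieces Schou describes concrete. The Python below is a purely hypothetical illustration, assuming a simple directory scan for the metadata catalogue, a regular-expression check for PII-like strings as the guardrail, and fixed-size text chunking as the curation step; none of it reflects NetApp’s actual implementation or APIs.

```python
# Simplified, hypothetical illustration of the three AI Data Engine functions:
# catalogue metadata, flag potentially sensitive data, and chunk text for
# downstream vector embedding. This is NOT NetApp code.
import re
from pathlib import Path

# Naive example pattern for PII-like strings (US SSN-style numbers only).
PII_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def catalogue(root: str) -> list[dict]:
    """'Metadata engine' step: scan a directory tree and record basic metadata."""
    entries = []
    for path in Path(root).rglob("*"):
        if path.is_file():
            stat = path.stat()
            entries.append({"path": str(path), "size": stat.st_size,
                            "modified": stat.st_mtime})
    return entries

def flag_sensitive(entries: list[dict]) -> set[str]:
    """'Data guardrails' step: flag files containing PII-like patterns."""
    flagged = set()
    for entry in entries:
        text = Path(entry["path"]).read_text(errors="ignore")
        if PII_PATTERN.search(text):
            flagged.add(entry["path"])
    return flagged

def curate(entries: list[dict], flagged: set[str], chunk_size: int = 1000) -> list[str]:
    """'Data curator' step: chunk non-sensitive files for embedding."""
    chunks = []
    for entry in entries:
        if entry["path"] in flagged:
            continue  # keep sensitive files out of the AI pipeline
        text = Path(entry["path"]).read_text(errors="ignore")
        chunks.extend(text[i:i + chunk_size] for i in range(0, len(text), chunk_size))
    return chunks  # in a real pipeline these chunks would be embedded and indexed
```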

The AI Data Engine is also a new product, one that goes hand in glove with AFX.

“This is definitively a new product from NetApp,” Baxter said. “Many people build AI infrastructure typically on NetApp AFF, and they store their data there. But then they have to use multiple different third-party tools to provide a catalogue of that data, to potentially provide curation of that data, and then a separate tool to create the embeddings, and then to store a vector database. We had one customer who was using 13 different tools to do this, and having to copy the data seven different times to do it. So there are certainly tools out in the market to do it, just not from NetApp. What we’re able to do by integrating it all directly where the data is, on this high-speed network, is simplify the process dramatically. So it is a new area of the market that NetApp is entering. We haven’t really gone up to the AI data services layer before. We’ve primarily just provided the data storage for AI. This is the first true data service that we’re offering that targets providing not just data storage, but AI-ready data.”

The NetApp AI Data Engine will run natively within the AFX cluster on top of the optional DX50 data control nodes. Future ecosystem support includes the integration of NVIDIA RTX PRO Servers featuring RTX PRO 6000 Blackwell Server Edition GPUs.

Other new solutions include an Object API for seamless access to Azure data and AI services. Customers will now be able to access their Azure NetApp Files (ANF) data through an Object REST API, which is available in public preview. This new capability means customers no longer need to move or copy file data into a separate object store to use it with Azure services. Instead, NFS and SMB datasets can be connected directly to Microsoft Fabric, Azure OpenAI, Azure Databricks, Azure Synapse, Azure AI Search, Azure Machine Learning, and others. Customers can analyze data, train AI models, enable intelligent search, and build modern applications on their existing ANF datasets.
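
NetApp has not detailed the Object REST API’s interface in this announcement, but assuming it presents an S3-compatible endpoint, reading an existing ANF dataset from Python might look roughly like the sketch below. The endpoint URL, bucket name, and credentials are placeholders, not real values.

```python
# Hypothetical sketch of reading an Azure NetApp Files dataset through the new
# Object REST API, assuming it exposes an S3-compatible interface.
# Endpoint, bucket name, and credentials below are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://anf-object.example.azure.net",  # placeholder endpoint
    aws_access_key_id="EXAMPLE_KEY_ID",                    # placeholder credential
    aws_secret_access_key="EXAMPLE_SECRET",                # placeholder credential
)

# List objects in a bucket that maps to an existing NFS/SMB dataset (hypothetical name).
response = s3.list_objects_v2(Bucket="anf-training-data", Prefix="documents/")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])

# The same objects could then be handed to Azure AI Search, Azure Databricks, or
# another Azure service without copying the file data to a separate object store.
```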

So what will all this mean for channel partners?

“NetApp is well known for having a robust and wide partner ecosystem,” Baxter said. “We do the vast majority of our business through the channel, and we greatly appreciate our channel partners; they’ve been partners with us in building and launching many of these products. AFX and the AI Data Engine will certainly be available from the channel, and not just in the sense of, oh, it’s another thing to sell, but I think it dramatically increases their total addressable market. There are so many channel partners who are familiar with NetApp, who are experts in NetApp ONTAP, who have been part of that ecosystem with us for, in many cases, decades. And this immediately unlocks the doors of these massive exascale AI factory opportunities for our channel partners to participate in, whereas before they might have had to learn an entirely new technology, or perhaps even sign a whole new partnership. Now they can just absorb this and immediately go into these massive AI factory deals. So it really is about opening up new opportunities and new gates for them.

“Many larger partners have incredibly well-built AI practices, and so they’re already conversant with and familiar with the ecosystem and able to provide that,” Baxter concluded. “But for smaller partners that may be more storage-focused, or at least more infrastructure-focused, the NetApp AI Data Engine gives them a way to engage with larger portions of their customers, and start to talk to more of the data practitioners and data scientists within those customers, and begin to provide greater and higher value-added services at a higher level in the stack, which can greatly expand their opportunity for business, their opportunity for professional services, and ultimately make them even more relevant to the customers that we jointly serve. It helps them start to bridge into this even larger ecosystem around AI, as well as start to bridge up into the data science and data prep layer, and provide solutions and capabilities there that they perhaps were not able to do before.”