NetApp CEO: We’ll bring AI to your data

NetApp CEO George Kurian

LAS VEGAS — To George Kurian, this moment in the evolution of artificial intelligence looks a lot like the early days of the cloud.

From his perspective as CEO of data storage giant NetApp, the big problem in the early days of the cloud was that companies’ data was spread across on-premises storage and various cloud options. There was a noticeable “chasm” between those storage silos that kept companies from getting the most out of moving to the cloud.

“We told you we’d work with the leaders in the industry to make a data fabric that makes it possible, easy, and seamless to use your own storage or any public cloud for any of your data,” he told attendees of this year’s NetApp Insight event here Monday afternoon. “We would build that bridge across the chasm.”

And now, Kurian said, his customers are looking into a similar chasm between the data fabric that houses a company’s data and the largely proprietary high-end hardware on which AI applications live.

“A big part of the AI challenge is a data challenge. How do I find the right data, how do I govern sensitive data by use or by purpose, how do I ensure it’s accurate and fresh so my applications and my customers get the right information, the latest information,” Kurian said.

He argued that AI is complicated, costly and time-consuming, at least in part because it’s being implemented in silos by big companies, with specialized infrastructure that’s kept entirely separate from “the data that forms the basis of the intelligence of your organization.”

So, NetApp will borrow from its cloud playbook. Kurian said the company plans to deliver “the best intelligent infrastructure for AI, bar none” over the next few years.

The keystone of that strategy is “bringing AI to your data,” wherever that data may be, by putting more intelligence and deeper hooks for AI apps into the company’s storage offerings. The goal is to “seamlessly deploy AI on data in place on-premise, in the cloud, and all points in between,” Kurian said.

The company will also work to offer greater options for scaling infrastructure and performance as needed for AI apps to do their work more effectively, and to offer “policy-based classification, privacy and security” that will follow customers’ data wherever it may go and ensure AI apps are exposed to updated data in a timely fashion.

“We’re building the best infrastructure for AI, and we’re going to build a robust data engine for AI and great intelligent data services for AI,” Kurian said. “Just like in the cloud, we’ll bridge that chasm between AI and your data by bringing AI to your data.”

Kurian said the opportunity for doing so is massive, citing estimates that fully realizing AI could deliver a 50 percent boost in the efficiency of business automation and a 60 percent increase in the benefits of data analysis and insight. All of that, he said, is backed by an estimated $7.9 trillion (U.S.) bump in worldwide productivity over the next decade due to AI.

“That is astounding, and to reach that, we need not only the top corporations to be taking advantage of AI, but every organization, every school, every hospital, in every part of the world, needs to have access to these tools,” Kurian said.

Robert Dutt

Robert Dutt is the founder and head blogger at ChannelBuzz.ca. He has been covering the Canadian solution provider channel community for a variety of publications and Web sites since 1997. 
