

NetApp and Intel Partner to Streamline the Deployment and Use of AI

NetApp, the intelligent data infrastructure company, is introducing NetApp AIPod Mini with Intel, a joint solution designed to streamline enterprise adoption of AI inferencing.

According to the companies, this collaboration addresses the challenges businesses face when deploying AI, such as cost and complexity, at the department and team level.

To thrive in the era of intelligence, enterprises have adopted AI to enhance efficiency and data-driven decision-making across their businesses.

In response, NetApp and Intel partnered to provide businesses with an integrated AI inferencing solution built on an intelligent data infrastructure framework, allowing individual business functions to apply their own data toward outcomes that support their needs.

NetApp AIPod Mini streamlines the deployment and use of AI for specific applications such as automating aspects of document drafting and research for legal teams, implementing personalized shopping experiences and dynamic pricing for retail teams, and optimizing predictive maintenance and supply chains for manufacturing units.

“Our mission is to unlock AI for every team at every level without the traditional barriers of complexity or cost,” said Dallas Olson, chief commercial officer at NetApp. “NetApp AIPod Mini with Intel gives our customers a solution that not only transforms how teams can use AI but also makes it easy to customize, deploy, and maintain. We are turning proprietary enterprise data into powerful business outcomes.”

NetApp AIPod Mini enables businesses to interact directly with their business data through pre-packaged Retrieval-Augmented Generation (RAG) workflows, combining generative AI with proprietary information to deliver precise, context-aware insights that streamline operations and drive impactful outcomes.
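At its core, a RAG workflow retrieves relevant proprietary documents and folds them into the prompt sent to a generative model. The following is a minimal sketch of that retrieve-and-assemble step, using a hypothetical in-memory document list and naive keyword-overlap scoring for illustration; it is not NetApp's actual implementation, which would use a vector store and an LLM behind the scenes:

```python
# Illustrative sketch of the retrieval half of a RAG workflow.
# The document store, scoring method, and prompt template are
# assumptions for demonstration, not NetApp AIPod Mini internals.

def retrieve(query, documents, top_k=2):
    """Rank documents by naive keyword overlap with the query."""
    q_terms = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_terms & set(d.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query, documents):
    """Combine retrieved proprietary context with the user question
    before handing the prompt to a generative model."""
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

# Hypothetical departmental documents (legal, retail, manufacturing).
docs = [
    "Maintenance logs show pump P-101 vibration rising since March.",
    "Retail pricing policy: discounts capped at 15 percent.",
    "Legal template library covers NDAs and supplier contracts.",
]
prompt = build_prompt("What do the maintenance logs say about pump P-101?", docs)
```

A production deployment would replace the keyword scoring with embedding-based similarity search and pass the assembled prompt to an inferencing model; the point here is only how retrieval grounds generation in a team's own data.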

By integrating Intel Xeon 6 processors and Intel Advanced Matrix Extensions (Intel AMX) with NetApp’s all-flash storage, advanced data management, and deep Kubernetes integration, NetApp AIPod Mini delivers high-performance, cost-efficient AI inferencing at scale, according to the vendors.

Built on an open framework powered by the Open Platform for Enterprise AI (OPEA), the solution ensures modular, flexible deployments tailored to business needs. Intel Xeon processors are designed to boost computing performance and efficiency, making AI tasks more attainable and cost-effective and empowering customers to achieve more.

“A good AI solution needs to be both powerful and efficient to ensure it delivers a strong return on investment,” said Greg Ernst, Americas corporate vice president and general manager at Intel. “By combining Intel Xeon processors with NetApp’s robust data management and storage capabilities, the NetApp AIPod Mini solution offers business units the chance to deploy AI in tackling their unique challenges. This solution empowers users to harness AI without the burden of oversized infrastructure or unnecessary technical complexity.”

NetApp AIPod Mini with Intel will be available in the summer of 2025 from strategic distributors and partners around the world.

Initial launch partners will include NetApp distributor partners Arrow Electronics and TD SYNNEX, as well as integration partners Insight Partners, CDW USA, CDW UK&I, Presidio, and Long View Systems, which will provide dedicated support and service to ensure a seamless purchasing and deployment experience for customers' unique AI use cases.

For more information about this news, visit www.netapp.com.
