Akamai Introduces Akamai Cloud Inference to Meet AI at the Edge

Akamai, the cybersecurity and cloud computing company, is launching Akamai Cloud Inference, aiming to usher in a faster, more efficient wave of innovation for organizations looking to turn predictive models and large language models (LLMs) into real-world action.

Akamai Cloud Inference runs on Akamai Cloud, the company's distributed platform, and is designed to address the growing limitations of centralized cloud models.

“Getting AI data closer to users and devices is hard, and it’s where legacy clouds struggle,” said Adam Karon, chief operating officer and general manager, cloud technology group at Akamai. “While the heavy lifting of training LLMs will continue to happen in big hyperscale data centers, the actionable work of inferencing will take place at the edge where the platform Akamai has built over the past two and a half decades becomes vital for the future of AI and sets us apart from every other cloud provider in the market.”

Akamai’s new solution provides tools for platform engineers and developers to build and run AI applications and data-intensive workloads closer to end users, according to the company. Akamai Cloud Inference includes:

  • Compute: Akamai Cloud offers a versatile compute arsenal, from classic CPUs for fine-tuned inference, to powerful accelerated-compute options in GPUs, to tailored ASIC VPUs, providing the right horsepower for a spectrum of AI inference challenges. Akamai integrates with NVIDIA's AI Enterprise ecosystem, leveraging Triton, TAO Toolkit, TensorRT, and NVFlare to optimize the performance of AI inference on NVIDIA GPUs.
  • Data management: Akamai enables customers to unlock the full potential of AI inference with a cutting-edge data fabric purpose-built for modern AI workloads. Akamai has partnered with VAST Data to provide streamlined access to real-time data to accelerate inference-related tasks, essential to delivering relevant results and a responsive experience. This is complemented by highly scalable object storage to manage the volume and variety of datasets critical to AI applications, and integration with leading vector database vendors, including Aiven and Milvus, to enable retrieval-augmented generation (RAG).
  • Containerization: Containerizing AI workloads enables demand-based autoscaling, improved application resilience, and hybrid/multi-cloud portability, while optimizing both performance and cost. With Kubernetes, Akamai delivers faster, cheaper, and more secure AI inference at petabyte-scale performance. The solution is underpinned by Linode Kubernetes Engine (LKE)-Enterprise, a new enterprise edition of Akamai Cloud's Kubernetes orchestration platform designed specifically for large-scale enterprise workloads, along with the recently announced Akamai App Platform. Together, these allow Akamai Cloud Inference to quickly deploy an AI-ready platform of open source Kubernetes projects, including KServe, Kubeflow, and SpinKube, seamlessly integrated to streamline the deployment of AI models for inference.
  • Edge compute: To simplify how developers build AI-powered applications, Akamai Cloud Inference includes WebAssembly (Wasm) capabilities. Working with Wasm providers like Fermyon, Akamai enables developers to execute inferencing for LLMs directly from serverless apps, allowing customers to run lightweight code at the edge to power latency-sensitive applications.
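To make the retrieval-augmented generation (RAG) workflow mentioned in the data-management bullet more concrete, here is a minimal, hypothetical sketch of the retrieval step: embeddings are matched by cosine similarity and the closest document is handed to the model as context. All names and vectors below are invented for illustration; a production system would query a vector database such as Milvus rather than an in-memory dictionary, and embeddings would come from an embedding model.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy stand-ins for stored document embeddings (hypothetical values;
# real embeddings are produced by an embedding model).
documents = {
    "doc-shipping": [0.9, 0.1, 0.0],
    "doc-returns":  [0.1, 0.8, 0.2],
    "doc-billing":  [0.0, 0.2, 0.9],
}

def retrieve(query_embedding, docs, top_k=1):
    """Return the top_k document ids ranked by cosine similarity."""
    ranked = sorted(
        docs,
        key=lambda doc_id: cosine_similarity(query_embedding, docs[doc_id]),
        reverse=True,
    )
    return ranked[:top_k]

# Hypothetical embedding of a user question such as "Where is my package?"
query = [0.85, 0.15, 0.05]
context_ids = retrieve(query, documents)
print(context_ids)  # the retrieved context would be prepended to the LLM prompt
```

The retrieved documents are what get injected into the model's prompt at inference time, which is why low-latency access to the data fabric matters for a responsive RAG experience.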

Together, these tools create a powerful platform for low-latency, AI-powered applications that allow companies to deliver the experience their users demand, the company said.

"Training an LLM is like creating a map, requiring you to gather data, analyze terrain, and plot routes. It’s slow and resource-intensive, but once built, it’s highly useful. AI inference is like using a GPS, instantly applying that knowledge, recalculating in real time, and adapting to changes to get you where you need to go,” explained Karon. “Inference is the next frontier for AI.”

For more information about this news, visit www.akamai.com.
