Friday, January 23, 2026

Built for Agentic Scale and Cloud-Native Apps


2025 was a pivotal year for Azure Storage, and we're heading into 2026 with a clear focus on helping customers turn AI into real impact. As outlined in last December's Azure Storage innovations: Unlocking the future of data, Azure Storage is evolving into a unified, intelligent platform that supports the full AI lifecycle at enterprise scale, with the performance modern workloads demand.

Looking ahead to 2026, our investments span the full breadth of that lifecycle as AI becomes foundational across every industry. We're advancing storage performance for frontier model training, delivering purpose-built solutions for large-scale AI inferencing and emerging agentic applications, and empowering cloud-native applications to operate at agentic scale. In parallel, we're simplifying adoption for mission-critical workloads, reducing TCO, and deepening partnerships to co-engineer AI-optimized solutions with our customers.

We're grateful to our customers and partners for their trust and collaboration, and excited to shape the next chapter of Azure Storage together in the year ahead.

Extending from training to inference

AI workloads extend from large, centralized model training to inference at scale, where models are applied continuously across products, workflows, and real-world decision making. LLM training continues to run on Azure, and we're investing to stay ahead by expanding scale, improving throughput, and optimizing how model files, checkpoints, and training datasets flow through storage.

Innovations that helped OpenAI operate at unprecedented scale are now available to all enterprises. Blob scaled accounts allow storage to scale across hundreds of scale units within a region, handling the millions of objects required to make enterprise data usable as training and tuning datasets for applied AI. Our partnership with NVIDIA DGX on Azure shows how that scale translates into real-world inference. DGX Cloud was co-engineered to run on Azure, pairing accelerated compute with high-performance storage, Azure Managed Lustre (AMLFS), to support LLM research, automotive, and robotics applications. AMLFS provides the best price-performance for keeping GPU fleets continuously fed. We recently launched preview support for 25 PiB namespaces and up to 512 GBps of throughput, making AMLFS the best-in-class managed Lustre deployment in the cloud.
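To put the throughput ceiling above in context, a quick back-of-envelope sketch shows what it means for checkpointing during training. The checkpoint size and achievable-fraction figures below are illustrative assumptions, not Azure-published numbers; only the 512 GBps ceiling comes from the text above (treated here as GiB/s for simplicity):

```python
# Back-of-envelope: seconds to persist a training checkpoint at a given
# sustained aggregate throughput. All workload numbers are hypothetical.

def checkpoint_write_seconds(checkpoint_gib: float, throughput_gibps: float) -> float:
    """Time to write a checkpoint at a sustained aggregate throughput."""
    return checkpoint_gib / throughput_gibps

# A hypothetical 2 TiB checkpoint against the stated 512 GBps ceiling,
# and against a more conservative 10% of that ceiling:
full_rate = checkpoint_write_seconds(2048, 512)    # 4.0 s
tenth_rate = checkpoint_write_seconds(2048, 51.2)  # 40.0 s

print(f"{full_rate:.1f} s at full rate, {tenth_rate:.1f} s at 10%")
```

The point of the sketch: at these throughput levels, checkpoint stalls shrink from minutes to seconds, which is what keeps large GPU fleets continuously fed.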

Looking ahead, we're deepening integration across popular first- and third-party AI frameworks such as Microsoft Foundry, Ray, Anyscale, and LangChain, enabling seamless connections to Azure Storage out of the box. Our native Azure Blob Storage integration within Foundry enables enterprise data consolidation into Foundry IQ, making blob storage the foundational layer for grounding enterprise knowledge, fine-tuning models, and serving low-latency context to inference, all under the tenant's security and governance controls.

From training through full-scale inferencing, Azure Storage supports the entire agent lifecycle: distributing large model files efficiently, storing and retrieving long-lived context, and serving data from RAG vector stores. By optimizing for each pattern end to end, Azure Storage offers performant options for every stage of AI inference.
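The RAG-serving stage mentioned above reduces to one core operation: given embedded chunks (however they are stored), return the ones most similar to a query embedding. A minimal, provider-agnostic sketch of that retrieval step, with toy embedding vectors standing in for a real embedding model:

```python
# Minimal RAG retrieval sketch: rank stored chunks by cosine similarity
# to a query vector. Chunk IDs and vectors here are toy placeholders.
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors (0.0 if either is zero)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def top_k(query: list[float], store: dict[str, list[float]], k: int = 2) -> list[str]:
    """IDs of the k stored chunks most similar to the query."""
    ranked = sorted(store, key=lambda cid: cosine(query, store[cid]), reverse=True)
    return ranked[:k]

store = {
    "chunk-a": [1.0, 0.0, 0.0],
    "chunk-b": [0.9, 0.1, 0.0],
    "chunk-c": [0.0, 1.0, 0.0],
}
print(top_k([1.0, 0.05, 0.0], store))  # ['chunk-a', 'chunk-b']
```

In practice the store would be an actual vector index fed from blob-backed data, but the access pattern (many small similarity lookups at inference time) is what the storage layer has to serve with low latency.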

Evolving cloud-native applications for agentic scale

As inference becomes the dominant AI workload, autonomous agents are reshaping how cloud-native applications interact with data. Unlike human-driven systems with predictable query patterns, agents operate continuously, issuing an order of magnitude more queries than traditional users ever did. This surge in concurrency stresses databases and storage layers, pushing enterprises to rethink how they architect new cloud-native applications.

Azure Storage is building with SaaS leaders like ServiceNow, Databricks, and Elastic to optimize for agentic scale, leveraging our block storage portfolio. Looking ahead, Elastic SAN becomes a core building block for these cloud-native workloads, starting with transforming Microsoft's own database offerings. It provides fully managed block storage pools that let diverse workloads share provisioned resources, with guardrails for hosting multi-tenant data. We're pushing the boundaries on maximum scale units to enable denser packing, and adding capabilities for SaaS providers to manage agentic traffic patterns.
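Why shared provisioned pools help with dense multi-tenant packing can be shown with simple arithmetic. The per-tenant peaks and concurrency factor below are hypothetical, not Elastic SAN figures; the sketch only illustrates the general pooling argument:

```python
# Illustrative capacity planning for a shared block-storage pool: if only
# a fraction of tenants hit peak IOPS at the same time, the pool can be
# provisioned well below the sum of individual peaks. Numbers are made up.
import math

def pooled_iops_needed(tenant_peaks: list[int], peak_concurrency: float) -> int:
    """Pool IOPS to provision if only a fraction of tenants peak together."""
    return math.ceil(sum(tenant_peaks) * peak_concurrency)

peaks = [20_000] * 10  # ten tenants, each peaking at 20K IOPS
isolated = sum(peaks)                          # 200,000 if provisioned per tenant
pooled = pooled_iops_needed(peaks, 0.4)        # 80,000 if ~40% peak together

print(f"per-tenant: {isolated:,} IOPS, pooled: {pooled:,} IOPS")
```

The guardrails the paragraph mentions matter precisely because of this oversubscription: the pool must keep one bursting tenant from starving the others.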

As cloud-native workloads adopt Kubernetes to scale rapidly, we're simplifying the development of stateful applications through our Kubernetes-native storage orchestrator, Azure Container Storage (ACStor), alongside CSI drivers. Our recent ACStor release signals two directional changes that will guide upcoming investments: adopting the Kubernetes operator model to perform more complex orchestration, and open-sourcing the code base to collaborate and innovate with the broader Kubernetes community.

Together, these investments establish a strong foundation for the next generation of cloud-native applications, where storage must scale seamlessly and deliver high efficiency to serve as the data platform for agentic-scale systems.

Breaking price-performance barriers for mission-critical workloads

Alongside evolving AI workloads, enterprises continue to grow their mission-critical workloads on Azure.

SAP and Microsoft are partnering to expand core SAP performance while introducing AI-driven agents like Joule that enrich Microsoft 365 Copilot with business context. Azure's latest M-series advancements add substantial scale-up headroom for SAP HANA, pushing disk storage performance to ~780K IOPS and 16 GB/s throughput. For shared storage, Azure NetApp Files (ANF) and Azure Premium Files deliver the high-throughput NFS/SMB foundations SAP landscapes rely on, while optimizing TCO with the ANF Flexible service level and Azure Files Provisioned v2. Coming soon, we'll introduce the Elastic ZRS storage service level in ANF, bringing zone-redundant high availability and consistent performance through synchronous replication across availability zones, leveraging Azure's ZRS architecture, without added operational complexity.

Similarly, Ultra Disks have become foundational to platforms like BlackRock's Aladdin, which must react instantly to market shifts and sustain high performance under heavy load. With average latency well below 500 microseconds, support for 400K IOPS, and 10 GB/s throughput, Ultra Disks enable faster risk calculation, more agile portfolio management, and resilient performance on BlackRock's highest-volume trading days. When paired with Ebsv6 VMs, Ultra Disks can reach 800K IOPS and 14 GB/s for the most demanding mission-critical workloads. And with flexible provisioning, customers can tune performance precisely to their needs while optimizing TCO.
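A small sizing sketch shows how the quoted per-disk and per-VM figures interact. Only the 400K-IOPS disk figure and 800K-IOPS VM figure come from the text above; the 1M-IOPS workload target is a hypothetical example:

```python
# Sizing sketch: how many Ultra Disks a target IOPS workload needs, and
# how the VM's own limit caps what is achievable. Target is hypothetical.

def disks_needed(target_iops: int, per_disk_iops: int = 400_000) -> int:
    """Minimum disks to reach a target IOPS (ceiling division)."""
    return -(-target_iops // per_disk_iops)

def achievable_iops(disk_total_iops: int, vm_cap_iops: int = 800_000) -> int:
    """Delivered IOPS is bounded by the VM cap (e.g. Ebsv6 at 800K)."""
    return min(disk_total_iops, vm_cap_iops)

print(disks_needed(1_000_000))              # 3 disks for a 1M-IOPS target
print(achievable_iops(3 * 400_000))         # 800000: the VM cap binds first
```

This is why disk and VM limits have to be sized together: provisioning disk IOPS beyond the VM cap buys nothing, which is where flexible provisioning helps optimize TCO.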

These combined investments give enterprises a more resilient, scalable, and cost-efficient platform for their most critical workloads.

Designing for new realities of power and supply

The global AI surge is straining power grids and hardware supply chains. Rising energy costs, tight datacenter budgets, and industry-wide HDD/SSD shortages mean organizations can't scale infrastructure simply by adding more hardware. Storage must become more efficient and intelligent by design.

We're streamlining the entire stack to maximize hardware performance with minimal overhead. Combined with intelligent load balancing and cost-effective tiering, we're uniquely positioned to help customers scale storage sustainably even as power and hardware availability become strategic constraints. With continued innovation on Azure Boost Data Processing Units (DPUs), we anticipate step-function gains in storage speeds and feeds at even lower per-unit energy consumption.

AI pipelines can span on-premises estates, neocloud GPU clusters, and the cloud, yet many of these environments are limited by power capacity or storage supply. When those limits become a bottleneck, we make it easy to shift workloads to Azure. We're investing in integrations that make external datasets first-class citizens in Azure, enabling seamless access to training, fine-tuning, and inference data wherever it lives. As cloud storage evolves into AI-ready datasets, Azure Storage is introducing curated, pipeline-optimized experiences to simplify how customers feed data into downstream AI services.

Accelerating innovation through the storage partner ecosystem

We can't do this alone. Azure Storage works closely with strategic partners to push inference performance to the next level. Beyond the self-publishing capabilities available in Azure Marketplace, we go a step further by dedicating resources and expertise to co-engineer solutions with partners, building highly optimized and deeply integrated services.

In 2026, you will see more co-engineered solutions like Commvault Cloud for Azure, Dell PowerScale, Azure Native Qumulo, Pure Storage Cloud, Rubrik Cloud Vault, and Veeam Data Cloud. We will focus on hybrid solutions with partners like VAST Data and Komprise to enable data movement that unlocks the power of Azure AI services and infrastructure, fueling impactful customer AI agent and application initiatives.

To an exciting new year with Azure Storage

As we move into 2026, our vision remains simple: help every customer unlock more value from their data with storage that's faster, smarter, and built for the future. Whether powering AI, scaling cloud-native applications, or supporting mission-critical workloads, Azure Storage is here to help you innovate with confidence in the year ahead.


