Blog | Aug 22, 2025

Agentic AI and the Evolution of Networking

Agentic AI Challenges Traditional Networking

Networking as we know it was designed for an earlier era — one optimized to deliver content such as webpages, video streams, and corporate applications. The architecture, pricing, and operational models of these networks are rooted in a paradigm that assumed relatively static applications, predictable traffic patterns, and deterministic outcomes.

The rise of agentic AI fundamentally changes this model. Unlike traditional software, AI systems generate results that shift depending on constantly changing inputs from multiple sources. This makes data movement, sovereignty, and trust central concerns in ways legacy networks were never designed to handle.

Outdated Cost Models

Legacy networks remain priced around costly hardware, complex services, and policy-driven appliances. The economics are license-driven rather than consumption-based, often charging per VPN, per tunnel, or per endpoint. On the high end, organizations are billed for large control-plane tables, feature tiers, and license renewals every 3–5 years.

This model might have worked in the past, but it breaks under modern AI workloads:

  • High-bandwidth AI pipelines rapidly inflate per-device or per-event licensing.
  • Renewal-based licensing discourages agility and experimentation.
  • Networks designed around static provisioning cannot adapt to ephemeral, on-demand AI connections.

The result: slow, costly, and operationally heavy infrastructure unsuited to emerging AI ecosystems.
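The economic gap is easy to see with back-of-the-envelope arithmetic. The sketch below contrasts per-endpoint licensing with consumption-based pricing; all prices and workload figures are hypothetical illustrations, not vendor data.

```python
# Illustrative comparison of license-based vs consumption-based pricing.
# All prices and workload figures are hypothetical assumptions.

def license_cost(endpoints: int, price_per_endpoint: float = 500.0) -> float:
    """Per-endpoint licensing: cost tracks provisioned devices,
    whether or not they actually carry traffic."""
    return endpoints * price_per_endpoint

def consumption_cost(gb_transferred: float, price_per_gb: float = 0.02) -> float:
    """Consumption pricing: cost tracks actual data movement."""
    return gb_transferred * price_per_gb

# An AI pipeline that spins up 1,000 ephemeral agent endpoints
# but moves only 5 TB of data in a billing period:
lic = license_cost(1_000)        # 500,000.0 — pays for idle provisioning
con = consumption_cost(5_000)    # 100.0 — pays only for data moved
print(f"license-based: ${lic:,.0f}  consumption-based: ${con:,.0f}")
```

Under per-endpoint licensing, every short-lived agent connection inflates the bill even if it moves little data, which is exactly why ephemeral AI workloads break the model.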

Non-Deterministic Nature of AI

Traditional applications are deterministic: the same inputs produce the same outputs. AI systems, however, are non-deterministic. Results change as the data sources evolve, shift, or even degrade.

This introduces profound challenges:

  • Dynamic Inputs: AI agents depend on constantly refreshed data feeds.
  • Reliability: The quality and trustworthiness of the sources directly influence the outcome.
  • Sovereignty: The location of data storage, processing, and movement becomes critical for compliance and national security.
  • Security Risks: A single poisoned data source or compromised agent can distort results with severe consequences.
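The contrast between deterministic software and agentic AI can be sketched in a few lines. The feed and agent below are hypothetical stand-ins: the point is only that an answer computed from live data sources changes between queries, so source quality and trust directly shape the result.

```python
# Sketch contrasting deterministic software with an agent whose output
# depends on live data feeds. Feed and agent_answer are hypothetical.
import random

def deterministic(x: int) -> int:
    # Traditional software: same input, same output, every time.
    return x * 2

class Feed:
    """A data source whose readings drift between queries."""
    def __init__(self, seed: int) -> None:
        self._rng = random.Random(seed)
    def read(self) -> float:
        return self._rng.uniform(0.0, 1.0)

def agent_answer(query: int, feeds: list[Feed]) -> float:
    # The same query yields different answers as the feeds evolve.
    return query * sum(f.read() for f in feeds) / len(feeds)

assert deterministic(21) == deterministic(21) == 42
feeds = [Feed(1), Feed(2)]
a1 = agent_answer(10, feeds)
a2 = agent_answer(10, feeds)  # the feeds have moved on; the answer shifts
print(a1 != a2)
```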

The Network AI Agents Need

Agentic AI requires a fundamentally different network architecture — one built around three pillars:

  1. Connectivity: High-speed, any-to-any secure communication across sites, regions, and clouds.
  2. Observability: Continuous visibility into both the control plane and data plane, providing contextual awareness of how agents and data flows behave. Path control and sovereignty depend on understanding how data moves and whom it reaches, which will be essential for agentic AI.
  3. Control: Centralized authorization and policy enforcement to ensure agents only access what they should, with sovereignty and compliance guarantees.

This architecture must operate with a centralized control plane (for authorization, context, and visibility) and a peer-to-peer encrypted data plane (for high-speed, ephemeral agent-to-agent communication). Unlike legacy systems, no heavy provisioning or device-by-device configuration should be required.

Why Change Is Needed

The shift from an Internet of Content to an Internet of Data demands a new foundation. In the content era, networks were optimized to deliver static pages and media; in the data era, business value is driven by real-time, dynamic, and ephemeral agent-to-agent exchanges.

  • AI connections can be short-lived and high-volume, requiring lightweight, encrypted fabrics rather than static tunnels.
  • Observability must be built in, not bolted on, to ensure agents can be monitored in real time.
  • Costs must be sustainable, moving away from per-device and per-license models to true consumption-based economics.

Networking For A New Era

Agentic AI is pushing networks into uncharted territory. Success will require networks that combine consumption-based economics, sovereignty-aware controls, and real-time observability with the agility to support ephemeral, high-speed agent-to-agent connections.

The networks of the past — tied to hardware, heavy configuration, and feature-based pricing — cannot meet these demands. The future requires networks engineered not just to deliver content, but to power dynamic data ecosystems and intelligent agents at a global scale.