Fiber for Breakfast Week 18: The Fourth Pillar of the AI Era

For years, the broadband industry focused on connecting people to the internet. Faster speeds, broader coverage, better access. This week’s Fiber for Breakfast pushed that idea further. In a conversation with Allan Isfan, Senior Director of Product Management Technology for Global Video Platform & AI at Warner Bros. Discovery, Gary explored what happens when networks are no longer just delivering connectivity but actively supporting how AI runs.

At a technical level, that shift is already underway. As Isfan explained, fiber is no longer just connecting data centers; it’s moving inside of them. “Fiber is becoming the fabric of the entire data center…essentially the bus that AI runs on.” It’s linking chips, racks, and systems together, turning the physical network into a core part of processing infrastructure.

That’s why fiber is emerging as what Isfan calls the “fourth pillar” of the AI era, alongside processing capacity, storage, and power. AI systems depend on moving massive amounts of data continuously, and small constraints in bandwidth or latency don’t just slow performance; they limit the value of the entire system. “If you can’t move data at the bandwidth you need, and with low latency,” Isfan noted, “you’re literally losing money.”
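Isfan's point about lost value can be made concrete with a back-of-envelope calculation. The sketch below (with hypothetical numbers, not figures from the interview) shows how interconnect bandwidth caps the effective utilization of expensive accelerators when data movement cannot be overlapped with compute:

```python
def effective_utilization(compute_time_s: float, data_bytes: float,
                          bandwidth_bytes_per_s: float) -> float:
    """Fraction of wall-clock time spent computing when each step must
    wait for a data transfer to finish (worst case, no overlap)."""
    transfer_time_s = data_bytes / bandwidth_bytes_per_s
    return compute_time_s / (compute_time_s + transfer_time_s)

# Hypothetical step: 100 ms of pure compute, 4 GB exchanged between nodes.
step_compute = 0.100   # seconds of compute per training step (assumed)
payload = 4e9          # bytes moved per step (assumed)

slow_link = effective_utilization(step_compute, payload, 100e9 / 8)  # 100 Gb/s
fast_link = effective_utilization(step_compute, payload, 800e9 / 8)  # 800 Gb/s

print(f"100 Gb/s link: {slow_link:.0%} utilization")  # ~24%
print(f"800 Gb/s link: {fast_link:.0%} utilization")  # ~71%
```

Under these assumptions, the slower fabric leaves the accelerators idle most of the time, which is exactly the sense in which a bandwidth constraint "loses money" on hardware that is paid for whether or not it is working.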

What follows from that is a change in how networks behave. Network traffic is no longer mostly downstream. AI introduces constant back-and-forth movement—east-west, north-south—across data centers, between nodes, and increasingly out to edge environments. At the same time, inference is moving closer to the user, distributing intelligence across the network instead of keeping it centralized.

And this isn’t a gradual shift. As AI models improve and agent-based systems begin to take hold, network demand doesn’t just grow; it multiplies. Instead of users making individual requests, software begins acting on their behalf, generating continuous activity across the network. That changes the scale and the shape of demand in ways traditional planning models don’t fully capture.
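The multiplication effect is easy to see with a simple model. The sketch below (hypothetical parameters, not from the interview) counts the network requests generated when a single user task is handed to an agent that fans out into sub-tasks, each making its own model or tool calls:

```python
def total_requests(user_tasks: int, fanout: int, depth: int,
                   calls_per_task: int) -> int:
    """Requests generated when every task spawns `fanout` sub-tasks at
    each of `depth` levels, and each task makes `calls_per_task` calls."""
    tasks = sum(fanout ** level for level in range(depth + 1))  # geometric growth
    return user_tasks * tasks * calls_per_task

# A user making one direct request:
print(total_requests(user_tasks=1, fanout=1, depth=0, calls_per_task=1))  # 1

# The same user behind an agent with modest fan-out (3 sub-tasks,
# 2 levels deep, 4 calls per task):
print(total_requests(user_tasks=1, fanout=3, depth=2, calls_per_task=4))  # 52
```

Even with conservative assumptions, one user action becomes dozens of network round trips, which is why planning models built around per-user request rates understate what agent-based systems will demand.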

The bigger takeaway is that the role of broadband is expanding. Networks are no longer just delivering content; they are becoming part of the infrastructure that enables how intelligence is created and applied. That has implications not just for capacity, but for how networks are designed, where processing capability lives, and how traffic flows.

Gary summed it up at the end: “Fiber doesn’t connect people to the internet…it connects them to the future.”

That shift reframes the opportunity. As AI continues to scale, the question isn’t just how fast networks are; it’s how well they support what comes next. For a deeper dive into this concept, explore the Fourth Pillar of the AI Era white paper. For those attending Fiber Connect 2026, Allan will expand on these ideas during his two-part workshop on Sunday, May 17. AI Fundamentals Part 1 (3:00-3:50 PM ET) will break down the core building blocks—neural networks, training, inference, and the role of fiber—followed by AI Fundamentals Part 2 (4:00-4:50 PM ET), a hands-on session demonstrating real-world AI applications, from prompting to agent-based workflows.

Click here to watch the full interview.

Click here to view the slides.