- An exclusive web-based interview with internet pioneer Geoff Huston, Chief Scientist at APNIC. In this feature, journalist Jessi Wu sits down with Huston to weave archival footage directly into a continuous narrative, allowing his own words to drive the story while her questions bridge the gaps between eras.
- Collaboration and open protocols enabled interoperability and wider adoption across networks.
Introduction
The modern internet did not emerge from a grand, top-down design. It was forged in the fires of pragmatic engineering under immense pressure. I sat down with Geoff Huston, a veteran of the internet’s early development and long-time Chief Scientist at the Asia Pacific Network Information Centre (APNIC). We traced the arc of how open standards triumphed over proprietary systems, how explosive growth nearly overwhelmed the network’s core, and why minimalist architectural choices allowed it to scale globally.
Today, Huston argues, the risks facing the internet are less about technical capacity and more about structural fragility, market concentration, and the societal upheaval driven by artificial intelligence.
“We didn’t decide the internet would become the global communications infrastructure,” Huston told me, leaning back in his chair. “We were just the only ones left in the room when the adults moved in.”
The Accidental Victory of Open Protocols
I began our conversation by asking about the chaotic landscape of the 1980s. Today, we take universal connectivity for granted, but back then, universities operated in silos, trapped by incompatible proprietary systems where IBM mainframes spoke SNA and Digital Equipment Corporation machines ran DECnet.
“How did you even begin to connect these different worlds?” I asked. “Was there a master plan?”
Huston laughed. “A master plan? Hardly. It was pure desperation.” He explained that the solution emerged not from a committee, but from necessity. The pragmatic adoption of IP (Internet Protocol) offered a vendor-independent method to interconnect networks precisely because it demanded so little uniformity.
In this segment, Huston details the specific frustration of the era: trying to connect terminals that simply couldn’t speak to one another. He recounts looking around the lab and realizing that while proprietary ports were locked down, almost every machine happened to have an available port for the Internet Protocol. He emphasizes that IP won not because it was technically superior in every way, but because it was the only common denominator already present in the room.
“That grassroots victory highlights a fundamental truth,” I noted, reflecting on his story. “It won through flexibility, not force.” While the telecommunications industry pushed for the complex, committee-designed OSI model, the engineering community quietly adopted IP because it solved the immediate problem without requiring a total system overhaul. As Huston put it, this “accidental” standardization proved that coordination without central control was not only possible but superior.
Surviving the Explosion
Shifting gears, I asked about the late 1980s, when the success of these open protocols triggered an unprecedented crisis. The internet was doubling in size every six to nine months—a growth rate that threatened to collapse the very infrastructure enabling it.
“It must have felt like the dam was about to break,” I suggested.
“It felt like the dam had already broken, and we were trying to hold back the water with our hands,” Huston replied. He described how routing tables and addressing schemes were hitting absolute breaking points.
Huston vividly describes the panic of the early 1990s as routing tables began to overflow, threatening to stop the internet from functioning. He explains the technical deadlock: the old class-based addressing system was too rigid for such rapid growth. In the clip, he details the frantic, real-time engineering effort to implement Classless Inter-Domain Routing (CIDR), describing it not as a planned upgrade but as an emergency patch, a shift to variable-sized address blocks that barely saved the network from collapse.
Listening to his account, the myth of perfect foresight in the internet’s design completely dissolves. “So scalability wasn’t a feature; it was a repair job?” I asked.
“Exactly,” he nodded. “A continuous process of repair.” The shift to CIDR was a subtle but vital architectural pivot. This era taught the community that the internet’s resilience lies not in a static blueprint, but in the ability of its stewards to execute real-time repairs while the plane is still in flight.
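The deadlock Huston describes is easy to see in miniature. The following sketch uses Python's standard `ipaddress` module; the specific prefixes are illustrative, not taken from the interview.

```python
import ipaddress

# Under classful addressing, a site needing ~1,000 hosts had to take a
# full Class B, because the next size down (a Class C) held only 254.
class_b = ipaddress.ip_network("172.16.0.0/16")
print(class_b.num_addresses - 2)   # 65534 usable hosts for ~1,000 needed

# CIDR lets the same site take a right-sized, variable-length block:
cidr_block = ipaddress.ip_network("172.16.0.0/22")
print(cidr_block.num_addresses - 2)  # 1022 usable hosts

# CIDR also lets routers aggregate contiguous prefixes into a single
# routing-table entry -- the change that relieved table overflow:
routes = [ipaddress.ip_network(f"192.168.{i}.0/24") for i in range(4)]
summary = list(ipaddress.collapse_addresses(routes))
print(summary)  # [IPv4Network('192.168.0.0/22')]
```

Four separate table entries collapse into one, which is why CIDR attacked both problems at once: address waste and routing-table growth.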
The Power of Minimalism
“Why did this work?” I pressed. “Other ambitious networking projects failed spectacularly during that same period. What was the secret sauce?”
“Minimalism,” Huston answered immediately. “We built a ‘dumb’ network.” He explained that while competitors sought to build “smart” networks capable of managing complexity internally, the architects of the internet chose to push intelligence to the edges.
Huston contrasts the clean, consistent logic of IP with the bloated, contradictory nature of the OSI model. He argues that committees inevitably compromise, resulting in “shocking decisions” and internal conflicts. He credits the success of the internet to the focused vision of a few individuals who refused to add unnecessary features to the core, insisting that the network should do the bare minimum to move packets and leave all complexity to the edge devices.
“It sounds almost counter-intuitive,” I observed. “You succeeded by doing less.”
“Precisely,” Huston said. “By strictly limiting what the core network is required to do, we preserved its ability to adapt. If we had tried to bake every future application into the core, we would have been obsolete within five years.” Instead, by pushing intelligence to the edges, the internet created a platform where innovation could flourish without needing permission from the center.
The Myth of IPv6 and the Reality of Abstraction
Our conversation naturally turned to IPv6. For decades, the industry has been told that the transition to IPv6 is the ultimate solution to address scarcity. Yet, the internet has continued to scale primarily through workarounds like Network Address Translation (NAT).
“Are we just delaying the inevitable?” I asked. “Or has the goalpost moved?”
“The goalpost moved twenty years ago,” Huston corrected gently. He challenged the assumption that every device needs a permanent, unique address, calling it a relic of the mainframe era.
Huston deconstructs the traditional view of IP addresses. He explains that in the modern client-server world, most devices only initiate connections and rarely receive them. Therefore, he argues, addresses are not permanent identities but merely “temporal tokens” used briefly to prevent packet confusion during a session. He details how NAT and overlay systems leveraged this insight to scale the network without needing to replace the underlying IPv4 infrastructure.
“So we’ve essentially transformed the internet from an addressing system into a naming system?” I summarized.
“Yes,” he confirmed. “In a world where most devices initiate connections but rarely receive them, raw addresses matter less than domain names and session management. The solution wasn’t more addresses; it was indirection.” It was a profound realization: we scaled to tens of billions of devices not by replacing the foundation, but by building smarter layers on top of it.
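Huston's "temporal token" framing can be sketched as a toy NAT table: many private hosts share one public address, and each outbound flow borrows a public port only for the lifetime of its session. This is a minimal illustration, not a real NAT implementation; all addresses and port numbers are made up.

```python
import itertools

PUBLIC_IP = "203.0.113.7"  # the one shared public address (illustrative)

class Nat:
    """Toy NAT: maps private (ip, port) flows onto shared public ports."""

    def __init__(self):
        self._ports = itertools.count(40000)  # next free public port
        self._out = {}   # (private_ip, private_port) -> public_port
        self._back = {}  # public_port -> (private_ip, private_port)

    def outbound(self, private_ip, private_port):
        """Translate an outgoing flow, allocating a token on first use."""
        key = (private_ip, private_port)
        if key not in self._out:
            port = next(self._ports)
            self._out[key] = port
            self._back[port] = key
        return PUBLIC_IP, self._out[key]

    def inbound(self, public_port):
        """Route a reply back; unsolicited traffic matches no session."""
        return self._back.get(public_port)

nat = Nat()
print(nat.outbound("10.0.0.5", 51000))  # ('203.0.113.7', 40000)
print(nat.outbound("10.0.0.9", 51000))  # ('203.0.113.7', 40001)
print(nat.inbound(40001))               # ('10.0.0.9', 51000)
print(nat.inbound(9999))                # None -- no session, no entry
```

The mapping exists only while the session does; nothing about the device's identity is permanent, which is exactly why client-heavy traffic patterns let IPv4 stretch so far.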
Centralization and the End of Moore’s Law
The tone of our discussion shifted as we looked at the current state of the industry. The internet’s decentralized ethos seems increasingly at odds with the reality of its infrastructure, dominated by a handful of tech giants. I asked Huston if he saw a link between this centralization and the slowing of Moore’s Law.
“For decades, Moore’s Law was our great equalizer,” I noted. “New entrants could always undercut incumbents with cheaper, faster chips. Is that era ending?”
“That protective mechanism is fading,” Huston warned, his expression turning serious. He explained how the relentless progress of the past kept giants in check, but as chip manufacturing approaches atomic limits, the cost advantages of new technology are diminishing.
Huston outlines the economic history of the internet, where Moore’s Law ensured that new players could always enter the market with superior, cheaper technology, forcing incumbents to innovate or die. He warns that as physical limits approach, this cycle is breaking. In the clip, he expresses concern that without the “threat of the future,” giants like Amazon and Google may face no competitive pressure, allowing them to solidify their dominance indefinitely.
“This feels like a systemic risk that goes beyond just processing speed,” I remarked.
“It is,” he agreed. “If the ‘threat of the future’ disappears, these companies face no competitive pressure. We risk entering an era where a few private companies, larger than many governments, operate without checks and balances.” Without the churn of rapid technological obsolescence, these centralized powers may solidify into permanent infrastructures, stifling the very innovation the internet was built to foster.
The Next Crisis: Artificial Intelligence
Finally, I asked about what keeps him up at night now. Is it another technical bottleneck? A security flaw?
“No,” Huston said firmly. “The next crisis isn’t technical. It’s societal. It’s Artificial Intelligence.” He described a future where AI disrupts labor markets and governance structures in ways we are only beginning to comprehend.
“When machines can write code and diagnose diseases better than humans,” I mused, “the fundamental value of human work changes entirely.”
“Exactly,” Huston said. “And we aren’t ready for it.”
Huston shifts focus from engineering to sociology, expressing deep uncertainty about the trajectory of AI. He argues that when AI can perform cognitive labor cheaper and better than humans, the social contract regarding employment and wealth distribution breaks down. He issues a passionate call for governments to intervene, warning that without regulation, the benefits of the digital economy will be captured entirely by a handful of billionaires, leaving society fractured.
“This is no longer an engineering problem,” I concluded, summarizing our hour-long dialogue. “It’s a crisis of social organization.”
“Correct,” Huston nodded. “The danger is no longer about packet loss. It’s about the erosion of the economic and social contracts that hold society together. The hard work ahead isn’t writing code; it’s summoning the political will to ensure the benefits of this digital economy are shared broadly, rather than captured by a handful of billionaires.”
Conclusion
As I waved goodbye over Zoom, I reflected on Huston’s journey from the campus networks of the 1980s to the frontlines of internet governance, and on the unique perspective it offers on the technology that defines our age. His message was clear: the internet survived its early growing pains through pragmatism and simplicity, but its future stability depends on our ability to confront the economic and social forces now shaping it. As the era of effortless scaling ends, the hard work of governance—and humanity—begins.
