The AI Wave Really Is Like the Early Internet — and I Almost Missed It the First Time

Gabe Hilado
Founder and CEO, Zenpo Software Innovations

Everyone talks about the first time they used AI to generate code like it was a revelation. Like the clouds parted. Like they immediately saw the future. That's not what actually happened for most experienced developers. What actually happened was a knot in the stomach.

Late 2023. ChatGPT writes a function. It works. Not perfectly, but it works. And the reaction isn't wonder. It's a quiet, specific dread. The kind that shows up when something you've spent two decades mastering suddenly looks like it might be on a timer. The feeling only intensified when Cognition introduced Devin, billed as the first AI software engineer, and the timeline seemed to lurch forward overnight.

That's the part nobody puts in their LinkedIn post. The anxiety came first.

Is anxiety about AI tools a normal reaction?

Not just normal. Predictable. If you've been building software long enough to have opinions about it — long enough to have shipped production systems, survived rewrites, outlasted framework wars — then watching a chatbot produce passable code in seconds should unsettle you. It means something fundamental changed. The people who weren't unsettled either weren't paying attention or hadn't been doing it long enough to understand what they were looking at.

The hype crowd skipped straight to breathless enthusiasm. The cynics dismissed it as a parlor trick. But the practitioners — the ones who'd actually written the kind of code the AI was now approximating — they felt something neither camp could explain. A recognition. Not of the tool's capability, but of the shift's weight.

That anxiety sat there for months. Through the early prompt engineering experiments, through the first real workflow changes, through the debates about whether AI-generated code was production-worthy or just a novelty. It didn't resolve quickly because the implications didn't resolve quickly.

When does the anxiety give way?

Sometime in 2025, the feeling changed. Not gradually. More like a phase transition — one state replacing another completely. The anxiety didn't fade into cautious optimism. It gave way to something closer to awe. Genuine, unguarded wonder at what was now possible in a single working session.

The shift happened when the tool stopped being something to evaluate and started being something to work with. When the conversation moved from "can this really do what it claims" to "what can we build now that this exists." When a three-day design cycle replaced what used to take three months with a team of eight. When deliberation stopped being a calendar problem and became a Tuesday afternoon.

That's when the analogy clicked. And it clicked because I'd felt this exact thing once before, and I'd handled it badly.

How does the current AI wave compare to the early internet?

Not the internet of 1999, with its IPOs and sock puppets and irrational exuberance. Not the internet of 2004, when the platforms were already consolidating. The internet of 1994.

I was at George Mason University then. Computer science major. The curriculum had us writing Pascal, Lisp, Prolog, and C/C++ on the lab UNIX boxes. Nobody on the faculty was certifying any of those as the future or as strategically serious — they were just what the curriculum happened to use. C/C++ were the durable ones in that list, the languages that would still be paying mortgages thirty years later, but the way we were taught them was the academic UNIX version: write the program, compile it, run it against the assignment, hand it in. The same language that was quietly building the infrastructure of the thing happening upstairs in the dorm rooms was, in my hands, a classroom exercise. Familiar. Easy. In front of me. I leaned into all of it for the same reason anyone leans into anything: it was the path of least resistance, and I was getting good at it on the timeline the institution rewarded.

The actual revolution was happening upstairs from the lectures, after class, in the dorm rooms and computer labs. AOL. The modem screech. The "You've Got Mail" chime. Warcraft 2 sessions stretching past midnight. Chatrooms full of strangers in other time zones — the first time in human history that was casually, cheaply possible. None of it felt important at the time. It felt like fun. It felt like the part of the day that didn't count.

If you were there — actually there, not reading about it later — you know exactly what that feeling was. You couldn't draw a diagram of where it was going. You couldn't pitch a VC on what it would become. You couldn't explain to your neighbor why this scratchy, slow, dial-up connection to a blue-and-white interface was going to change everything. You just knew. Somewhere in your nervous system, before your analytical brain caught up, you recognized that something irreversible had started.

That's the feeling. Not the technology comparison — TCP/IP versus transformers, browsers versus chat interfaces. Those parallels are interesting but they're not the point. The point is the gut recognition. The moment when the size of the wave becomes undeniable even though you can't see the shore it's heading toward.

And here's the part I want to be honest about: I felt it. I went back to Pascal in the morning.

How do you know when a technology shift is real versus hype?

You don't know in the way you'd like to know. Not with data. Not with market projections. Not with Gartner quadrants.

Back then, nobody had a coherent thesis for what the internet would become. Nobody predicted Google, Amazon, social media, streaming, cloud computing, the entire digital economy. The best predictions from the smartest people were laughably wrong in retrospect. The point isn't that they were dumb — they weren't. The point is that real platform shifts are definitionally unpredictable in their specifics. That's what makes them platform shifts and not incremental improvements.

What you could know then — what you could feel in your hands as you used it — was that the connectivity itself was the disruption. Not any particular application of it. The raw fact of being connected changed the physics of information, and everything downstream from that was going to reorganize. You didn't need to predict the specific reorganization. You just needed to recognize that the reorganization was inevitable.

AI is the same signal. Not because "AI is like the internet" in some hand-wavy thought-leader sense. Because the gut feeling is identical. The raw fact of machine intelligence — imperfect, uneven, still early — changes the physics of knowledge work. Everything downstream from that is going to reorganize. Predicting the specifics is a fool's errand. Recognizing the inevitability is not.

Recognizing it, though, isn't the hard part. The hard part is what you do in the four years after you recognize it.

What is the risk of waiting to adopt AI tools?

I'm going to answer this by telling you what waiting cost me, because the abstract version of this answer is useless and the concrete version is the only one that ever changed anyone's mind, including mine.

I'd interned at MITRE in the late 90s writing what we now politely call "classic ASP." I had paid reps on the web. I had shipped real web code. None of that mattered, because after graduation I took a job that put me on a database project written in VB6, and within a year I was telling myself the desktop work was the more serious one. The internship became something I'd done once, in another life, before I got a real job. The web track didn't go away, exactly. It got demoted. I kept writing web code on my own time the way other developers wrote game engines: as the unpaid, after-hours, for-love version of the craft. The VB6 work was the paid track. Both tracks ran in parallel for years and never merged.

That isn't how I would have explained it at the time. At the time, the explanation I would have given you — and the explanation I told myself — was that I saw desktop application programming and web programming as coequal. Two things on a menu. Pick whichever fits the project. They were peers in my mental model, different tools for different jobs, and I happened to specialize in one of them the way another developer might specialize in the other. That framing felt reasonable. It felt like the responsible adult version of having a craft.

It was a cope, and the truth underneath it is the only thing in this essay that matters.

The truth is that I felt desktop programming was serious programming, and I felt web programming wasn't. Desktop had events. It had error handling and exit codes. It had memory leaks you had to hunt down, ActiveX quirks you had to work around, COM lifecycles, the whole hard texture of Win32 work that demanded you actually understand what was happening underneath. Web — classic ASP, ColdFusion, Perl CGI, JScript — was procedural. You ran it from top to bottom. There were no events. There was no error handling worth the name. There were no exit codes. It was script. It was what you did when you weren't doing the real work.
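To make that contrast concrete, here's an illustrative sketch of the two textures, written from memory rather than lifted from any real project; the control and helper names (cmdSave, txtName, SaveRecord) are invented for the example. The VB6 side waits for an event and handles failure explicitly; the classic ASP side just runs, top to bottom, and writes out whatever happens:

    ' VB6: event-driven, with an explicit failure path
    Private Sub cmdSave_Click()
        On Error GoTo SaveFailed       ' structured (for the era) error handling
        SaveRecord txtName.Text        ' invented helper; the point is the shape
        Exit Sub
    SaveFailed:
        MsgBox "Save failed: " & Err.Description
    End Sub

    <%
    ' Classic ASP (VBScript): a script that runs top to bottom; no events, no handlers
    Dim userName
    userName = Request.Form("name")
    Response.Write "Hello, " & userName
    %>

One fragment can't do anything until the user acts; the other can't do anything except run. That difference was the entire basis of my taste judgment.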

(My web programmer peers, for the record, thought they were the serious ones — because they were banging out classic ASP in Notepad while I sat inside the fancy VB6 IDE like a tourist who needed training wheels. Two tribes, looking across the aisle, each seeing a lesser version of the craft. Both tribes were about to find out that one of them had been right about the technology and wrong about the world, and the other had been wrong about the technology and right about the world, and which side you were standing on was going to determine the shape of your next decade.)

Here is what I want every reader to sit with for one full breath before reading the next paragraph: the taste judgment I was making was 100% accurate as a description of the technology and 100% wrong as a prediction of which one would matter. I wasn't being lazy. I wasn't being stupid. I was being a good engineer, faithfully applying the aesthetic my craft had trained me to apply — and that aesthetic was a high-precision instrument calibrated to a world that was about to stop existing. Classic ASP really was simpler than COM+. Perl CGI really was less rigorous than Win32 event handling. The taste was right about everything except the only question that ended up mattering, which was which one of these is going to eat the other.

I want to be specific about when I figured this out, because the when is the most important part of the story.

The moment came in 2002. ASP.NET launched. I started watching the salary ranges for web developers drift upward and leave Windows desktop programmers behind. Not in some abstract industry-trends way; in an I-can-read-the-job-postings-myself way. The market was repricing the work in front of me, in real time, against me. The signal wasn't ambiguous. I saw it clearly. I read it correctly.

I'd been writing web code since the MITRE internship in the late 90s. I felt the signal in 2002. I wouldn't take paid web work as my primary craft until 2006.

That's the entire essay compressed into three sentences. Nearly a decade of hands-on web exposure. Four years between recognition and action. And the obstacle wasn't ignorance; I had everything ignorance would have prevented. The obstacle was a taste judgment I was still defending in 2002 because it had served me well enough through 2001, and I couldn't get my hands to do something my craft told me was a downgrade. The conditions were never going to be better. They were as good in 2002 as they were ever going to be. I just couldn't get my feet to move against my own aesthetic.

My first paid web project as my primary craft was in 2006, at the Department of Defense. By my own honest estimate it set me back six to seven years — not from where I'd have been if I'd never used VB6, but from where I'd have been if I'd moved in 2002, when I first saw the line bending. That tax didn't cripple me. I'm still here, writing this. But I can still feel it almost twenty years later, and I can describe it precisely enough to put a number on it, which should tell you something about how invisible-but-permanent this kind of cost is.

Here is the part I think the people I'm writing this for need to hear directly. If you are mid-career, the tax you'll pay for the seeing-and-waiting gap is years of compounding intuition you can't buy back at any price — instincts the developers who started experimenting in 2023 are accumulating right now, while you evaluate. If you are late-career, the tax compresses against a shorter remaining runway, and there's a dignity cost on top of it: being a visible beginner in rooms full of people who used to look up to you. Same mechanism, different blast radius. Neither version kills you. That is exactly what makes both versions so easy to keep paying.

What did the AOL era teach us about platform shifts?

There's a comforting story people tell about technology adoption — that you can always catch up, that late movers have the advantage of learning from early mistakes, that the tortoise beats the hare. It's true for individual products. It's not true for platform shifts. You can catch up to a specific tool. You cannot catch up to years of accumulated instinct about how a new paradigm works.

The window of easy entry — when everyone is a beginner, when the playing field is level, when experimenting is cheap and low-stakes — closes quietly. Not loudly. Quietly. By the time the people who waited come around, the early movers aren't just further ahead. They're operating in a different context entirely. The conversation has moved. The assumptions have shifted. The baseline has risen. You don't notice the moment you fall off the back of the pack, because the pack stopped looking at you a long time ago.

The gap isn't knowledge. You can read about AI tools in 2026 just like you could in 2023. You can take the courses. You can watch the talks. The gap is intuition, and intuition only comes from reps — small bets, small failures, small insights, quietly compounding for the people already doing the work while everyone else is still deciding whether to start.

This is what almost missing it actually looks like in practice. Not a single dramatic failure. A slow quiet repricing of what your skills are worth, against your will, while you tell yourself the timing isn't right yet. The people who fully missed the early internet are not the cautionary tale of this story. They're the easy version of the cautionary tale. The harder version — the version that should keep you up at night if you've already felt a flicker of recognition about AI — is the version where you see it clearly, you read the signal correctly, and you wait anyway, and later you do the math on what the waiting cost.

If you have twenty or thirty years in this craft, here is the part I most want you to hear, because it is the part nobody tells you and the part I would have most needed to hear in 2000. Your taste is the trap.

The instincts that make you respected in your current work — the ones that recoil from sloppy abstractions, that demand correctness guarantees, that insist on understanding the system before changing it, that distrust anything that feels like cargo-culting — those instincts are real, hard-won, and load-bearing for the world you trained in. They are also exactly what's going to steer you wrong on AI, the same way mine steered me wrong on the web.

AI-assisted coding will fail every test your taste applies to it. Hallucinations. Brittle reasoning. No model of invariants. No grasp of architectural constraints. Confident output that's subtly wrong in ways that will cost you a day to find. Every one of those observations is accurate. That is what makes them so dangerous. They are not the noise of a hype cycle. They are the signal of a craft trained on a world that is already starting to disappear, faithfully applied to a thing that does not belong to that world.

The taste is correct. The taste is also the obstacle. Both at once. The only way through is to notice this is happening and override it on purpose, because it will not feel wrong from the inside. It will feel like having standards.

The AI wave isn't a product launch. It's not a framework choice. It's not a vendor decision you can revisit next quarter. It's a platform shift. The same play, different stack. Every week you spend evaluating instead of experimenting is a week of intuition you are not building, and intuition is the only thing in this game that doesn't compress on demand. The clock that started ticking when that first ChatGPT-generated function compiled and ran doesn't pause while you think about it. It didn't pause for me in 2002. It will not pause for you in 2026.

Nobody knew exactly where the internet was going. That was never the point. The point was that you could feel it. And what you did with that feeling — whether you moved toward it or stood still — determined the next decade.

The feeling is back. Same weight. Same uncertainty. Same inability to draw the full picture. Same quiet, undeniable recognition that the ground just shifted under everything.

I'm telling you all of this because I felt it the first time and I waited four years anyway. I'd like to believe I'd handle it differently now. Mostly, I just want to make sure you don't have to find out the hard way that you wouldn't.

What you do with this feeling, this time, is the only question that matters.