The Abstraction Rises – The Cyber Omelette
The Coding Priesthood Has Fallen: How AI Agents Are Reshaping Software Development
In the time it takes to earn an undergraduate degree, Large Language Models (LLMs) have transformed from delivering realistic chat responses to autonomously coordinating and completing tasks at the scale of entire engineering teams.
Remember when Stack Overflow was your lifeline? That magical place where you’d land when stuck on a coding problem, hoping another developer had already suffered through the same issue and found a solution? Since 2022, however, new Stack Overflow posts have plummeted by 77%. Developers have discovered a new oracle: AI assistants that can generate entire codebases in minutes.
I’ve witnessed this revolution firsthand. I recently built a full-stack iOS app for photographing, indexing, and searching my personal storage bins. The backend was familiar territory, but iOS development wasn’t. So I delegated 100% of the frontend work to AI agents. By afternoon, I had a fully functional prototype. But these tools aren’t perfect—I once tried adding a simple SSL certificate path field, and the agent fixated on an unrelated parameter, refusing to acknowledge its error despite repeated corrections.
Steven Yegge’s description of Gas Town, his provocative orchestration tool, captures this new reality perfectly: “Gas Town is an industrialized coding factory manned by superintelligent robot chimps, and when they feel like it, they can wreck your shit in an instant.” Relatable indeed.
Yet while it’s clear something fundamental has changed, the long-term implications remain uncertain. Some predict AI superintelligence is imminent (for better or worse). Others believe we’re mistaking philosophical zombies for true intelligence, speedrunning our own cognitive decline.
When facing an uncertain future, I find it helpful to anchor predictions in historical outcomes. It turns out that “Vibe Coding”, or more precisely “Automatic Programming”, has been invented before. Reading this history reshaped my understanding of where we’re headed.
The Priesthood Era
Let’s travel back to the early 1950s. Computing machines had evolved from mechanical calculators to room-sized behemoths capable of ending wars. The ENIAC and UNIVAC represented the cutting edge, with fewer than 1,000 computers existing worldwide.
Programming these machines was brutally labor-intensive. It required punch cards, with programmers defining exact mechanical steps from operator manuals. Adding two numbers meant explicitly choreographing where values would be stored, when computation would occur, and wiring outputs to meaningful addresses. Even basic tasks involved mind-numbing complexity.
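To make that choreography concrete, here is a toy sketch in Python of a made-up three-instruction accumulator machine (an illustration, not any real instruction set of the era). The programmer picks the storage addresses, sequences every operation by hand, and wires the result to a known output cell:

```python
# Illustrative only: a toy accumulator machine showing why "add two numbers"
# required explicit choreography of storage addresses and operation order.
memory = [0] * 8               # numbered storage cells, chosen by the programmer
memory[1], memory[2] = 5, 7    # operands placed at hand-picked addresses

program = [
    ("LOAD", 1),   # accumulator <- memory[1]
    ("ADD", 2),    # accumulator <- accumulator + memory[2]
    ("STORE", 3),  # memory[3] <- accumulator
]

accumulator = 0
for op, addr in program:
    if op == "LOAD":
        accumulator = memory[addr]
    elif op == "ADD":
        accumulator += memory[addr]
    elif op == "STORE":
        memory[addr] = accumulator

print(memory[3])  # the sum lands at the address the programmer wired for output
```

A modern language collapses all of this to `c = a + b`, with the compiler choosing the addresses and the instruction sequence.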
The most notable computation of the era was forecasting presidential election winners. A UNIVAC I system correctly predicted Eisenhower’s landslide victory using just 5.5% of voter samples. Another groundbreaking application was the SAGE air defense system, consuming 250,000 to 500,000 lines of assembly code and employing 7,000 engineers—approximately 20% of the world’s programmers at the time.
Leading minds considered programming a black art so advanced it could never be generalized for the masses. While simplification efforts existed, like the Laning and Zierler system at MIT, they slowed machines down by factors of five to ten. With computers costing nearly $1 million apiece, that inefficiency was like lighting money on fire. The small group capable of producing efficient, correct code considered themselves exceptionally clever and scoffed at the idea of being replaced. John Backus later referred to this elite group as “The Priesthood”—and he was one of them. But he had plans to disrupt the status quo.
Enter Automatic Programming
As head of IBM’s Programming Research Group, John Backus sought to vastly simplify effective use of the IBM 704. His motivation was refreshingly blunt: “I didn’t like writing programs,” he later admitted. His team aimed to build an abstraction on top of machine code, seeking to simplify logic without sacrificing speed. The stakes were enormous—computers cost nearly $1 million, and more than 90% of project time went to planning, writing, and debugging while machines sat idle.
In the 1956 Programmer’s Reference Manual, Backus made a bold claim: “Since FORTRAN should virtually eliminate coding and debugging, it should be possible to solve problems…”
In hindsight, this seems absurd. There’s a saying that “You can write FORTRAN in any language,” a reminder that any language can produce buggy, illegible code. But compared to hand-coded assembly, FORTRAN delivered. Programs requiring 1,000 machine instructions could be written in just 47 FORTRAN statements. GM’s productivity studies showed FORTRAN reduced programming effort by factors of five to ten. This was significant progress.
FORTRAN wasn’t alone in dramatically simplifying coding. Grace Hopper, a computing pioneer who helped build the UNIVAC I, was equally convinced coding didn’t need to be so opaque. She drove the creation of FLOW-MATIC, a predecessor to COBOL and the first compiled language to adopt English-like syntax. Her motivation was simple: you can’t force a businessman to learn mathematical notation.
Programming languages were moving from esoteric punch card holes toward portable, readable text.
What They Expected
As code became easier to write, not everyone was impressed. In Backus’ own reflections, he describes: “The Priesthood wanted and got simple mechanical aids for the clerical drudgery which burdened them, but they regarded with hostility and derision more ambitious plans to make programming accessible to a larger population.”
The skeptics had economics on their side. Slowdown meant lost money, so Automatic Programming would have to match hand-coded efficiency to be adopted. There was also a human component: programming was a hard-won skill, and implying it could be automated felt like an attack on the craft itself.
Meanwhile, analysts in the early 1960s predicted everyone would have to become a programmer to meet expected demand. Computers were proliferating. Software was essential. The prediction was that programmers would become as common as typists, and technology would be democratized.
What Actually Happened
Despite the skeptics’ economic arguments, the efficiency problem was eventually solved. FORTRAN’s optimizing compiler produced code that ran nearly as fast as hand-coded assembly. The IEEE Computer Society later noted it was “the best overall optimizer for not 5 years, not 10 years, but 20 years.” Coupled with falling computation costs, the performance objection evaporated.
The democratization vision partially came true. COBOL became the world’s most widely used language, but not for use by the general masses. Businessmen still needed trained programmers. Reading and writing code proved to be very different skills. As Turing Award winner Fred Brooks put it, simpler programming languages reduce the accidental complexity of a task, but the essential complexity remains. You still have to know what you want the computer to do, and that can be very hard.
While not everyone wrote computer programs, the number of computers in the world exploded: from fewer than 1,000 in the early 1950s to over 7 billion smartphones alone today. Computers became daily tools for most of the world, not just for business.
With easier programming, teams didn’t shrink either. Paradoxically, they grew. The U.S. went from 200,000 computer workers in 1970 to 1.6 million by 2015, with estimates of 26-28 million globally. As a percentage of the S&P 500, technology went from around 3% when the index launched in 1957 to over 32% in 2024—now more than double the next largest sector.
The Priesthood lost its grip, and the black art of telling computers what to do made it to the masses.
Looking Back, Looking Forward
This exploration began with history. Reading John Backus’ reflections on being a programmer in the 1950s, I found they matched my own experience from my days optimizing GPU kernels as an HPC developer.
He states: “The programmer had to be a resourceful inventor to adapt his problem to the idiosyncrasies of the computer… he had to employ every trick he could think of to make a program run at a speed that would justify the large cost of running it. And he had to do all of this by his own ingenuity.”
When I first read this quote, I realized I am part of the modern Priesthood. Seventy years later, it still fits how I and others think of our craft. But history shows that even the sharpest skills can be made obsolete, while the need for niche expertise to push computation to its limits remains. What changes is the problems that expertise is applied to.
There’s a saying in cycling: “It never gets easier, you just go faster.” Perhaps knowledge work is the same: it doesn’t get easier, the systems just get more complex. Before FORTRAN, crude election forecasts and tracking a handful of aircraft were state of the art. The next 50 years brought weather forecasting, universe-scale simulations, and the sequencing of the human genome. This wildly exceeded anyone’s imagination at the dawn of the compiler era.
This lesson predates FORTRAN as well. When James Watt optimized the steam engine, the expectation was that coal use would plummet. Instead, it ballooned. This is known as Jevons Paradox, and it continues today. In 2016, Nobel Laureate Geoffrey Hinton predicted that radiologists would be obsolete within five years. Nearly a decade later, radiologist Dana Smetherman, MD, notes that not only is demand strong, “AI might even increase the workload by identifying additional findings.”
When primed with lessons from history, I find my own technological arrogance fading. My concerns about obsolescence have shifted toward curiosity about what remains to be built. The accidental complexity of coding is plummeting, but the essential complexity remains. The abstraction is rising again, to tame problems we haven’t yet named.