The Golden Age of the Programmer: Are You Taking Advantage of It?

Every time the cost of programming has fallen, the number of developers has grown. That's why I think we're in the best era to be a developer.

Contributors: Ivan Garcia Villar

For two years now, I’ve been hearing that programmers are going to be out of work. I’ve heard it from consultants, investors, journalists who’ve never compiled a program in their lives, and occasionally from colleagues in the industry having a bad day. I’ve thought it myself sometimes. Today, I think they’re all wrong. And I think history speaks volumes on this point, so I won’t mince words: we’re entering the most exciting era in history to be a programmer. I’m not saying this to reassure anyone. I’m saying it because the data, for seventy years, has consistently pointed in the same direction.

The Same Old Fear

The question everyone asks is “will AI replace me?” But that’s not the right question.

The right question is: what happens when building software costs half as much?

The historical answer is always the same. You build twice as much. Or three times as much. Or you build things that didn’t exist before because they weren’t economically viable. The number of people building those things doesn’t shrink — it grows. The fear of technological replacement isn’t new, and in this industry it hasn’t been right once. Maybe this time is different. But I need more evidence before I believe it.

This Has Happened Before — Three Times

We’re not facing something unprecedented. We’ve lived through versions of this same moment at least three times in the history of software, and every single time the result was identical.

The 1950s and ’60s: from punch cards to FORTRAN and COBOL

Before FORTRAN in 1957, programming required understanding the physical architecture of the machine. It was expensive, specialized, exclusive. Not for everyone — it was for a very specific technical elite, and the software that resulted reflected that cost. FORTRAN let scientists and engineers write mathematical formulas directly, without going through assembly. COBOL (1960) was explicitly designed to let more people program, including business people who weren’t hardware engineers.

The goal was to lower the barrier. They succeeded.

The result? In the 1960s, newspapers ran stories about a brutal shortage of programmers. Demand in banking, government, and industry had exploded. According to period reporting collected by Click Americana, in Denver in 1967 a freelance programmer said he could put a hundred more programmers to work in the financial sector of that city alone.

The tool became more accessible, the number of problems solvable with software increased, and demand for people who could solve them exploded along with it.

The 1990s and 2000s: frameworks, the web, and the weekend developer

With Rails, Django, jQuery, PHP — building a web application went from requiring a team of experts to something one person could do over a weekend. The barrier dropped again.

The result: the JavaScript community grew from around 10.7 million developers in 2018 to more than 20 million in just a few years, according to SlashData. The global developer community roughly doubled in a decade — and notably, SlashData’s 2025 data shows that while amateur developers actually decreased by more than one million, professional developers kept growing. The expansion was driven by people doing this as their main work, not by casual tinkerers. A three-person startup could suddenly build a product for millions of users. Things that were previously economically impossible became viable. And that required people to build them.

The 2010s: mobile, cloud, and invisible infrastructure

App stores, AWS, infrastructure as a service. Launching a global product without ever touching a physical server. Again the cost dropped. Again more people appeared building more things. Not fewer specialized roles — more. Cloud architects, DevOps, SREs emerged — profiles that didn’t exist before and are in high demand today.

The pattern has been consistent for seventy years. At this point, it’s hard for me to ignore it.

What the Numbers Say

The historical context in a single table:

| Era | What lowered the barrier | The fear at the time | What actually happened |
| --- | --- | --- | --- |
| 1950s–60s | FORTRAN, COBOL | “Hardware specialists won’t be needed” | Massive programmer shortage; demand exploded in banking, government, industry |
| 1990s–2000s | Rails, Django, PHP, jQuery | “Anyone can build websites, devs will lose value” | The JS community went from ~11M to 20M+; developers multiplied across the industry |
| 2010s | AWS, App Stores, SaaS | “Infrastructure becomes commodity, sysadmins disappear” | Cloud architects, DevOps, SRE emerged — new roles with even higher demand |
| 2020s–today | Copilot, Cursor, AI agents | “AI is going to program for us” | TBD, but the historical pattern suggests: more developers, different problems |

In 2000, there were around 680,000 software engineers in the United States, according to the Bureau of Labor Statistics. Today there are more than 47 million developers worldwide, according to SlashData estimates. That’s not an apples-to-apples comparison — the 2000 figure is US-only while the 2025 figure is global — but even within the US, the BLS shows software developer employment roughly tripled between 2000 and 2024. Across every abstraction layer added over seventy years, the number didn’t contract. It multiplied.

There’s a nuance worth being honest about: the BLS category “computer programmers” — a narrow job title — has indeed declined to levels not seen since the 1980s, as Fortune reported in March 2025. But the broader “software developers” category continues to grow. This distinction is actually the whole point: the role evolves, the category label changes, the demand for people who build software does not disappear. If anything, this pattern repeats the COBOL clerk narrative — the specific job title shrinks while the underlying work expands into new roles.

I may be wrong about what comes next. But I need to see evidence against seventy years of consistent pattern before accepting the replacement narrative.

What Changes Now — in a Good Way

Something genuinely is different this time: the pace. Previous abstractions took years to democratize. Rails needed several years to reach mass adoption. AWS needed nearly a decade to become the default infrastructure. With AI, the cycle is much shorter. GitHub Copilot has been available to the public since 2022, with a technical preview since 2021; by 2025, most developers on active projects have some AI tool at hand.

But the most interesting thing isn’t the speed. It’s what changes in the daily work.

We fight less with syntax. More with what matters: designing solutions, making architecture decisions, validating that what’s being built actually makes sense, taking responsibility for outcomes. Part of what we used to do gets automated. That’s good — the same way it was good that FORTRAN automated assembly.

A developer working with AI doesn’t produce the same software faster. They produce different software that was previously not economically viable to build. I’ve seen this directly: teams shipping internal data tooling that used to require a dedicated platform team, backoffice automation for a niche of two hundred companies that no SaaS vendor ever served, agents that handle domain-specific workflows that were simply too expensive to automate before. That’s the distinction that matters most to me, and the one least mentioned in the debate about the industry’s future.

The End of General-Purpose SaaS

There’s a consequence of cheaper software that I think is still poorly understood: the end of general-purpose SaaS as we know it.

Today’s SaaS products are generalist by economic necessity. It was never viable to build and maintain software specifically for a niche of two hundred companies. The development cost only made sense serving large markets with a single, barely customizable product. Software for everyone and optimized for no one in particular.

When that cost drops, the argument breaks down.

Some companies will choose fully custom solutions. Instead of paying monthly licenses for software that covers 70% of their needs, they’ll have their own tool that covers 95%. Others will keep using SaaS — but micro-SaaS products, very specific ones that didn’t exist before because they weren’t profitable to build: markets no one served because the size didn’t justify the investment.

In neither case does the result mean fewer developers. Where a single team of ten once supported thousands of customers with one product, dozens of teams can now exist, each building a different product for a market no one previously saw.

That said, the incumbents aren’t sitting still. The major SaaS platforms — Salesforce, ServiceNow, the big collaboration suites — have real moats: network effects built over years, compliance certifications (SOC 2, ISO 27001) that take enterprises years to approve, and they’re actively integrating AI themselves. I’m not predicting the disappearance of SaaS. I’m predicting the erosion of its competitive advantage in verticals where it used to be the only viable option. The competitive differentiation becomes real, and that creates space for more teams, more products, more problems to solve.

When “Programming” Stops Being the Right Word

There’s a thought I’ve been turning over for a while and I think deserves saying directly: “programming” as a term might stop making sense sooner than we think. And that doesn’t worry me.

“Typist” stopped being a profession when word processors arrived. But the ability to write quickly and accurately didn’t disappear. It redistributed: everyone started typing on keyboards. The profession dissolved because the skill became universal.

Something similar might happen with “programming” in the strict sense. If anyone can generate working code from a natural language description, the barrier of “knowing how to program” fades. But what doesn’t disappear is the need for someone who knows what to build, why, and how to validate that what was built actually solves the problem.

That’s always been what’s valuable. The language, the framework, the tool — those are the medium, not the end.

And if the day ever came when there were no more problems to solve… that would be a wonderful world. Probably very boring. Let’s hope it takes a while.


Three Misconceptions Worth Dismantling

Confusing tool automation with the disappearance of the craft

The most widespread mistake: thinking that because AI can generate code, no one needs to understand it, validate it, and make decisions about it. A carpenter using a power saw doesn’t have less work than one using a hand saw. They have more — and can tackle projects that were previously impossible. The more powerful tool doesn’t eliminate the craftsperson: it expands what the craftsperson can build.

Treating the current demand for developers as if it were a ceiling

The implicit reasoning goes like this: “if AI can do X, there will be less work for developers.” That argument assumes the amount of software the world wants to build is fixed. It isn’t. As soon as the cost of building software drops, new categories of problems appear to solve, new companies that didn’t exist before, new products that previously weren’t profitable. Demand isn’t a ceiling. It’s a floor — one that has historically always risen when the barrier dropped.

Forgetting that AI also generates new complexity

This is the one least mentioned, and the one that interests me most. AI agents need orchestration, verification, context management, integration with existing systems. None of those problems are trivial. Someone has to solve them. And right now, that someone is a developer.
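To make that concrete, here is a minimal sketch of the verification problem in Python. Everything in it is invented for illustration — `call_model` is a stand-in that returns canned snippets instead of calling a real model — but the shape of the work is real: someone has to decide what “correct” means and enforce it before AI output is accepted.

```python
# Sketch: a verification-and-retry loop around AI-generated code.
# `call_model` is a placeholder that returns canned snippets so the
# example runs without any real model or API.

def call_model(prompt: str, attempt: int) -> str:
    canned = [
        "def add(a, b): return a - b",   # first attempt: subtly wrong
        "def add(a, b): return a + b",   # second attempt: correct
    ]
    return canned[min(attempt, len(canned) - 1)]

def passes_checks(source: str) -> bool:
    """A human-written verifier: execute the snippet, test its behavior."""
    namespace = {}
    try:
        exec(source, namespace)
        return namespace["add"](2, 3) == 5
    except Exception:
        return False

def generate_with_verification(prompt: str, max_attempts: int = 3) -> str:
    """Keep asking until a candidate passes, or give up loudly."""
    for attempt in range(max_attempts):
        candidate = call_model(prompt, attempt)
        if passes_checks(candidate):
            return candidate
    raise RuntimeError("no candidate passed verification")

code = generate_with_verification("write an add function")
```

The model produces candidates; the developer decides the acceptance criteria, the retry policy, and what happens when nothing passes. None of that is generated for free.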

If You Want to Make the Most of This Moment

This isn’t a career plan. These are the things that, in my case, have been useful for not losing the thread amid all the noise:

  • Practice delegating routine implementation to AI tools, without losing the ability to read and validate what they generate
  • Develop judgment for knowing what problem deserves to be solved with software. AI doesn’t do that for you
  • Learn the patterns that let you build with AI agents: orchestration, verification, context management
  • Identify a domain where you can build something that wasn’t previously profitable to build
  • Keep writing your own code. The person who only delegates loses the intuition for what’s possible and what isn’t

For example, validating and orchestrating AI agents requires someone to structure the flow.
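One way to picture that structuring — step names, checks, and the pipeline shape are all illustrative assumptions, not any particular framework’s API:

```python
# Sketch: an agent flow expressed as explicit, checked steps.
# Each step transforms a shared context dict, and the orchestrator
# enforces a per-step check before moving on. All names are invented.

def draft_spec(ctx):
    ctx["spec"] = f"spec for: {ctx['request']}"
    return ctx

def implement(ctx):
    ctx["code"] = f"# implements {ctx['spec']}"
    return ctx

def review(ctx):
    ctx["approved"] = ctx["code"].startswith("#")
    return ctx

PIPELINE = [
    ("draft_spec", draft_spec, lambda c: "spec" in c),
    ("implement",  implement,  lambda c: "code" in c),
    ("review",     review,     lambda c: c.get("approved", False)),
]

def run(request: str) -> dict:
    ctx = {"request": request}
    for name, step, check in PIPELINE:
        ctx = step(ctx)
        if not check(ctx):
            raise RuntimeError(f"step {name!r} failed its check")
    return ctx

result = run("backoffice export tool")
```

Deciding which steps exist, what each check asserts, and where a failure stops the flow is design work, not generation work.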

This is exactly the kind of work AI generates, not eliminates: someone has to think about how to validate, how to orchestrate, what to ask for and why.

A few years ago I wouldn’t have considered it viable to share what I know and how I think as a project. Today it is. Not because AI writes for me, but because it helps me build this — without losing ideas along the way, giving them the shape they deserve, reaching a level of quality I wouldn’t have the time or patience to reach alone.

AI didn’t replace me. It unblocked me.

Will AI reduce the number of programming jobs?

Possibly in certain specific profiles and at some companies in the short term. It’s worth acknowledging that the BLS category “computer programmers” — a narrow job title distinct from “software developers” — has declined sharply, as Fortune reported in March 2025. But the broader “software developers” category has continued to grow. This isn’t a contradiction: the role evolves and the category label changes, while the underlying demand for people who build software persists. The historical pattern suggests the net long-term effect is more developers, not fewer, because new categories of problems appear to solve. The short term may be turbulent. The underlying trend, based on seventy years of history, points in another direction.

What kind of developer has the most future in the AI era?

The one who knows what to build and why, more than the one who executes precise instructions in a specific language. The ability to design systems, make architecture decisions, validate results, and understand the problem domain — that’s what AI doesn’t replace yet. Syntax and routine implementation get automated. Judgment doesn’t.

Does this mean anyone can be a developer?

The syntax barrier is flattening, yes. But the judgment barrier — knowing what to build, how to structure it so it’s maintainable, and how to validate that it works in real conditions — that doesn’t disappear. If anything, it becomes more important. The developer who only knew how to write correct code has less of a competitive edge than before. The one who knows how to think through solving problems has more.