
Put them all in a row.
There’s a lot swirling right now: Claude Code getting good, AI-powered political messaging, McDonald’s making a depressing AI ad everyone hates, OpenAI going “code red,” Trump giving Nvidia the green light to sell some chips to China, and on and on.
On a personal level, despite my co-founder’s best attempts, I keep moving forward on my Claude Code experiments. I’m grasping the contours of what I can do with AI in coding and it’s no longer quite so brain-melting—I’m getting used to it, as always happens—but it also leaves me convinced that next year is going to be a different kind of year.
I recently had a good conversation with Dan Shipper on the Every.to podcast about some of these ideas, but I wanted to zoom out a little and just describe what I think is happening. My observation is simple: Making a computer behave in a deterministic, predictable way requires a lot of skilled human labor, but, starting now, it will require less skilled labor.
This kind of change has happened before. For example, I was an English major in college, not a computer scientist; building web pages to publish my own stuff was how I learned to program. But it took me many years of learning before I could build a substantive platform. My hunch is that, before 2026 is done, a smart non-programmer who can understand high-level concepts will be able to build a working, usable platform—say, a basic TikTok clone—with a few weeks of high-level training. Not solving for everything—security, maintainability, privacy, optimization, mobile app design, or the million other horrific tasks involved in building and managing software. But still.
So what? We know AI does stuff fast. A lot of the conversation about generative AI has focused on “soft skills”-type tasks. Can it soothe an anguished customer? Be a therapist? Produce “real” images, prose, and music? Is it thinking? Will it turn into AGI? Is this good or bad? These tend to be questions without answers, as you well know. Around and around we go.
But programming is different. If you zoom way out, programming is about taking something very soft—human desires or needs—and making them follow a deterministic model. That’s a statistical term, but software people use it a lot. Wikipedia says, “a deterministic system is a system in which no randomness is involved in the development of future states of the system.”
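If you want that in code, a toy sketch makes the distinction plain: same input in, same output out, every single time.

```python
import random

def deterministic_total(prices):
    # Deterministic: the same list of prices produces the same total, every run.
    return round(sum(prices), 2)

def stochastic_estimate(prices):
    # Not deterministic: randomness is involved, so repeated runs disagree.
    return round(sum(prices) * random.uniform(0.9, 1.1), 2)

print(deterministic_total([1.50, 2.25, 0.75]))  # 4.5, today, tomorrow, forever
print(stochastic_estimate([1.50, 2.25, 0.75]))  # something near 4.5, different each time
```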
So: My soft brain tissue generates thoughts, so I turn on my computer, which boots up reliably. My keyboard works, my operating system works, my web browser works, and Google Docs works. I can type this paragraph with high confidence that when we paste it into Mailchimp and email the newsletter out to lots of people, they will all see, with certain well-understood caveats, the exact same set of words. (Thanks for reading!)
Over decades, millions of people have been involved in making this all work so well. They created network stacks, JavaScript libraries, web browsers, WiFi routers, and so forth. It was understood that this work was very, very hard and very, very error-prone, and that everyone involved must be paid very, very well.
The instructions we type into the computer (or copy and paste) break so often that most of the work of programming is actually debugging. It sometimes took years to build a large, robust software product, but taking something complex and human and making it deterministic—like health records, college application management tools, or sales lead managers—drove so much productivity that we literally organized our economy around it, and coders became a big part of that.
All of that is going to change. It’s changed before. You used to need to know machine code, then programming languages and compilers showed up; you used to log into a mainframe, then personal computers showed up; you used to send out CD-ROMs to install programs, then the web let you deliver software right in the browser.
Now you can describe, in plain language, what software you need, and an LLM-powered tool will break it into subtasks, take a swing at each one, test the swings, and then use the debugging information to take another, more refined swing. Given time and a little steering, these tools can—not always, but increasingly—produce well-tested, well-architected code that looks and behaves like code written by large teams over many months.
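Stripped down to a skeleton, that loop looks roughly like the sketch below; every function name here is a hypothetical placeholder for whatever a given tool actually does, not a real API.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class TestReport:
    passed: bool
    errors: str

def build_from_description(
    request: str,
    plan_subtasks: Callable[[str], List[str]],   # planning step: request -> subtasks
    write_code: Callable[[str], str],            # first swing at a subtask
    run_tests: Callable[[str], TestReport],      # compile, lint, and test the attempt
    revise_code: Callable[[str, str], str],      # refine the code using the error output
    max_attempts: int = 5,
) -> List[str]:
    """Plan, attempt, test, and refine until the tests pass or patience runs out."""
    finished = []
    for task in plan_subtasks(request):
        code = write_code(task)
        for _ in range(max_attempts):
            report = run_tests(code)
            if report.passed:
                break
            # Feed the failures back in and take another, more refined swing.
            code = revise_code(code, report.errors)
        finished.append(code)
    return finished
```

The interesting part is the inner loop: the test failures are not a dead end, they are the input to the next attempt.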
There are exceptions, but there’s a real thing here, too. It’s working surprisingly well today; a month from now, I expect it will be even better. The system is starting to improve itself. In a year, I think we’ll start to see the real limitations—at least I hope we do.
There’s a reasonable assumption that this is happening so rapidly that it will eat up many of the millions of jobs mentioned above. Then again, if the past is a guide, this new capability will not lead to the destruction of the software industry. Every one of those giant step changes ended up bringing more people into the field.
So maybe what this means is that (1) way, way more people can build custom software, and (2) software can be built much, much more quickly than before. If you can think algorithmically at all, you can build software this way. You need to be able to describe the deterministic states you want to achieve, and the computer can take it from there. But you don’t need to know syntax.
What would the world look like, I wonder, if, say, half a billion people could make software—ten times as many as now? Or five billion? I love software and I love mess; I say it would look awesome.
There are a lot of people who I think of as basically programmers; they just don’t usually work in code: Deep spreadsheet users, musicians yelling out chord changes, sociologists, pollsters, music producers, research clinicians, baseball nerds, and climate scientists. They think algorithmically, they get data, and they understand that you put information into a system and get different information out. All of them are going to be able to do really complex software things in the future: Build native mobile apps, build database-backed web platforms, program sensors and microcontrollers, filter tons of data, migrate huge legacy databases. Of course they have to want to.
These things once required teams of engineers and hundreds of thousands, or millions, of dollars, and now they…may not? This can be a panic-inducing thought, but I hope people can think more broadly. Now that deterministic behavior and all of its benefits are available to the masses, how can we welcome in a new group of algorithmic tinkerers? What databases can we help them convert, what websites can we help them set up, what dashboards can they use to do a better job? How can we bring the magic of taking something soft and making it repeatable to everyone?
And in reverse, what can these groups bring to the table? Can sociologists bring us their simulations of mass behavior? Can scientists translate their models so that we can use them in our systems? Can those musicians build systems that teach us how to play piano a little better? When code is no longer a barrier to entry, but rather a capability of a system, what kinds of new conversations can we have?
In short: I spent the bulk of my career saying “no” to new ideas and features because of cost, and it always broke my heart. Now, it feels like I’m in a position to say “yes.” We started talking about some new, weird website designs to explore and someone said, “It’ll take two months,” and I said, “Why not just do it in a coworking session next week and see where we get?” I recently heard from a not-for-profit focused on hunger relief that’s having a hell of a time managing multiple databases of donors. I can just say “yes” to them, and figure out the details later. This is new.
I can feel a lot of eyes rolling, including my co-founder’s. I get it. But I keep building apps and doing test migrations and building component systems, and I’m starting to sit down and teach people what I know. I like the spongey parts of culture, and I like determinism. I think it’s a bit of a miracle that I get to see all those boundaries collapse at once. Sometimes it feels like everything is a disaster, but at the same time, we live in a miraculous age—and I would hate to waste a miracle.