Today on the AI Daily Brief, should anyone learn to code anymore? The AI Daily Brief is a daily podcast and video about the most important news and discussions in AI. To join the conversation, follow the Discord link in our show notes.
Hello, friends. Quick note, traveling today, so we are only doing a main episode. It's a really fun one, a really dynamic conversation that's happening right now on X, Twitter, and everywhere else about whether people should learn to code. It gives us a chance to think about vibe coding, the future of AI. So I hope you enjoy this conversation. We'll be back at the beginning of next week with our normal format again.
Welcome back to the AI Daily Brief. Today we are picking up on a conversation that is hot on X right now, but is also part of a much longer, bigger conversation that's been happening for some time as AI coding has gotten more and more advanced.
And that conversation is, of course, whether anyone should learn to code anymore. Now, the specific provocation for this version of the conversation came when the CEO of Replit, Amjad Masad, tweeted a clip from an interview he'd given with the simple message: I no longer think you should learn to code.
Keep in mind, this is the CEO of Replit, a company that is specifically about coding and building applications. Now, beyond that conversation, the context for this is, of course, the rise of Bolt, and in general, this new move towards vibe coding, where text-to-code tools are allowing people to create things in a way that was never possible before.
Ultimately, what this is really about is just how good at coding AI is getting. Claude 3.5 and now 3.7 have been the standard for some time. And just this week, we got Gemini 2.5, and the early buzz has some suggesting that it's even better.
All of this recently led Anthropic CEO Dario Amodei to predict that AI will be writing 90% of code within the next few months, and closer to 100% of code in the next year or so. In fact, he said that the only constraint holding that back will be human and systems inertia, things like business and enterprise systems, not AI's capabilities.
So this is the context for these comments from the CEO of Replit, and this started an absolute firestorm of discussion. Of course, part of why programmers are so interested in this is that one of the uncomfortable possibilities is that a tool that they've created is actually going to replace them first before it replaces anyone else. So let's talk first about what I think is inevitable.
I think basically that Dario's right. I think that even to the extent that AI isn't writing 90% of code in the next few months, it will only be because of human systems inertia, not because of its capabilities. And I think that over time, the improved efficiency and capability of AI will beat out whatever systems inertia it faces.
There is plenty of evidence of this if you look around. Y Combinator recently made headlines when they said that for about a quarter of their companies, 95% of the code was written by AI. At the end of last year, Google CEO Sundar Pichai said that over 25% of new Google code was being generated by AI.
The point is that things are changing. And when it comes to this question of whether you should learn to code or not, that question is going to take place in the new context of AI writing a ton of the code. In fact, likely the majority of the code.
So what is the con argument? What is the reason not to learn to code? To some extent, it's fairly obvious, and it's really just about time efficiency and waste. If AI is going to be better at coding, which it is, don't waste your time trying to learn to do things that AI is just going to do so much better. Instead, learn other skills. For example, let's listen to these actual comments from Amjad: In the case where, as Dario just said recently, all code will be AI-generated, and assuming this optimization path we're on where agents are going to get better and better and better, the answer would be different. The answer would be no; it would be a waste of time to learn how to code. But you could have different predictions, and I think different people will make different assumptions. At this point I'm sort of agent-pilled, I'm very bullish. I've sort of changed my answer. Even a year ago, I would say kind of learn a bit of coding. Now I would say learn how to think, learn how to break down problems, learn how to communicate clearly, as you would with humans, but also with machines.
So effectively, he's saying, don't learn that old set of skills that we call coding. Pick other higher leverage skills instead, which start from the standpoint that AI is doing the coding. Hey, listeners, want to supercharge your business with AI?
In our fast-paced world, having a solid AI plan can make all the difference. Enabling organizations to create new value, grow, and stay ahead of the competition is what it's all about. KPMG is here to help you create an AI strategy that really works. Don't wait, now's the time to get ahead.
Check out real stories from KPMG of how AI is driving success with its clients at kpmg.us slash AI. Again, that's www.kpmg.us slash AI. Today's episode is brought to you by Vanta. Trust isn't just earned, it's demanded.
Whether you're a startup founder navigating your first audit or a seasoned security professional scaling your GRC program, proving your commitment to security has never been more critical or more complex. That's where Vanta comes in. Businesses use Vanta to establish trust by automating compliance needs across over 35 frameworks like SOC 2 and ISO 27001, centralize security workflows, complete questionnaires up to 5x faster, and proactively manage vendor risk.
Vanta can help you start or scale up your security program by connecting you with auditors and experts to conduct your audit and set up your security program quickly. Plus, with automation and AI throughout the platform, Vanta gives you time back so you can focus on building your company. Join over 9,000 global companies like Atlassian, Quora, and Factory who use Vanta to manage risk and prove security in real time.
For a limited time, this audience gets $1,000 off Vanta at vanta.com slash NLW. That's V-A-N-T-A dot com slash NLW for $1,000 off. Today's episode is brought to you by Super Intelligent and more specifically, Super's Agent Readiness Audits.
If you've been listening for a while, you have probably heard me talk about this, but basically the idea of the Agent Readiness Audit is that this is a system we've created to help you benchmark and map opportunities in your organization where agents could specifically help you solve problems and create new opportunities, in a way that, again, is completely customized to you. When you do one of these audits, what you're going to do is a voice-based agent interview, where we work with some number of your leadership and employees to map what's going on inside the organization and figure out where you are in your agent journey.
That's going to produce an agent readiness score that comes with a deep set of explanations, strengths, weaknesses, key findings, and of course, a set of very specific recommendations that we can then help you fulfill by finding the right partners. So if you are looking for a way to jumpstart your agent strategy, send us an email at agent at besuper.ai, and let's get you plugged into the agentic era. Many people, though, took the other side.
Martin Casado from a16z writes, I think you should learn to code. So what are some of the reasons that people might take that position? There are, I think, a couple of different categories of reasons. One of them is the somewhat esoteric argument that coding isn't about the actual output; it's about teaching you how to think. Dagster Labs founder Nick Schrock made that point, saying Steve Jobs claimed that everyone should learn to code, not because it's useful or economically valuable, but because it teaches you how to think.
And certainly there's an argument that in a world where even more of our world is mediated by code, the particular genre of thinking that coding enables is even more valuable. There are also many flavors of the answer that learning to code is going to make you a better vibe coder. Ankur Goyal writes, AI multiplies the quality of code that someone would write without AI. Bad programmer times AI, lots of bad code. Medium programmer times AI, lots of medium code. Great programmer times AI, lots of great code.
Another argument is that there are just going to be some things that AI coding isn't as good at, like systems that need to get wired together. Dave Plummer writes, even if nobody writes code in five years, of course you should still learn to code. Why? I know how to write a quicksort. I've done that zero times, but I know when quicksort is appropriate. I know how to write a hash table, so I know when std::map is appropriate. In five years, there will be prompt engineers and software engineers. And the best software engineers will remain the ones who could have written it by themselves, but didn't have to.
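To make Plummer's point a little more concrete, here's a minimal C++ sketch, my own illustration rather than anything from his post, of what that judgment looks like in practice: you never hand-write the quicksort or the hash table, but you know that std::sort and std::map are the right tools to reach for.

```cpp
// Illustrative only: the point is knowing which standard tool fits the job,
// not writing the sorting or hashing logic yourself.
#include <algorithm>
#include <iostream>
#include <map>
#include <string>
#include <vector>

int main() {
    // Ordering a list: reach for std::sort instead of writing
    // the partition-and-recurse logic of a quicksort by hand.
    std::vector<int> scores{42, 7, 19, 3, 88};
    std::sort(scores.begin(), scores.end());

    // Keyed lookup: reach for std::map (or std::unordered_map, the actual
    // hash table) instead of rolling your own buckets and probing.
    std::map<std::string, int> wins{{"alice", 3}, {"bob", 5}};
    wins["carol"] += 1;

    for (int s : scores) std::cout << s << ' ';
    std::cout << '\n' << "bob has " << wins["bob"] << " wins\n";
    return 0;
}
```

Nothing in that snippet is hard for AI to generate, but knowing why std::sort and std::map show up there at all is exactly the kind of judgment Plummer is describing.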
There are other arguments as well. On the innovation side, some people point out that coding tools are good at copying code but not good at innovating. So if we ever want or need new languages in the future, those are not likely to come from AI. Another argument, a more temporary one, is just that these vibe coding tools aren't particularly good in enterprise settings yet, and so at least for some time that's an opportunity. And then there's the argument that if you don't think you should learn to code, you're just not thinking interestingly enough.
And it should be noted that there is an economic and jobs dimension to this. Despite his previous assertion that you should learn to code, Martin Casado also tweeted recently, I was on a call a few nights ago with a number of very senior devs, and every one of them used AI to code and found it faster than working with junior devs. It's really worth thinking through what that means for the industry.
Basically, if there are no more junior developer jobs, does that mean you shouldn't learn to code because you're not going to be gainfully employed? Now, the counterargument to that, which I think bridges us into where I want to take the conversation, comes from swyx of Latent Space and the AI Engineer Summit, who writes, if you're thinking don't learn to code because nobody is hiring junior engineers anymore, you're missing the AI opportunity because you're tying your identity to an increasingly obsolete problem.
So what's my take? On the one hand, I think learning to code the traditional way would be absolutely insane right now. Or rather, learning to code with the expectation that the world will look anything like it does right now, to say nothing of five years ago. No matter what you're thinking, you have to build your plans on the 100% assurance that things are going to be different. And I think that Dario is right, and that the vast majority of the actual lines of code in the world will ultimately be written by AI.
So yes, learning to code to get a junior developer job seems a little insane right now.
On the flip side, I think there is basically nothing higher leverage that you could be doing right now than learning this new vibe coding paradigm. Coding is digital creation, and that creation is now opened up to a hugely increased set of people. The sheer amount of things that are going to be created is about to increase massively, and in that is unfathomable opportunity. Yet that opportunity is still going to flow to the people who have the skills to actually harness it.
Although Andrej Karpathy is, I think, right when he said back in January of 2023 that the hottest new programming language was English, that doesn't mean that there aren't skills in using these new tools. Matt Bean writes, it's neither learn to code nor don't. False dichotomy. Coding 1.0 is dead. Coding 2.0 is being born. They're related, but that's it.
So what is Coding 2.0? Everywhere you look, you can see examples of people changing how they interact not only with code, but with the world around them, based on this new set of vibe coding tools. At Super Intelligent, when we are designing or discussing new tools, new features, or new platforms, we no longer do it with words, at least not exactly. Everyone on the team is expected to just go into Lovable and produce prototypes of what they're talking about.
Interested in building the premier marketplace matching enterprises and agents? Don't tell us, show us.
But it's also impacting hobby creation for me. For the last few years now, my favorite hobby has been Magic: The Gathering cube design. A cube is basically a set of Magic: The Gathering cards that you bring together that are meant to be drafted together, and there are an infinite number of ways to create them. You can create cubes based on particular sets or environments or themes that you like. So if I really like the horror sets from Innistrad, I could create a cube all about that, where I'm no longer constrained by the game designers and can bring together elements in my own way.
After a while, that got boring to me, so I started designing my own sets. For the last couple of years now, I've been working on a set based on early American colonization and exploration, particularly around harvest time. But all of this was getting kind of boring, until recently I started thinking about a game that would combine the drafting experience that I love from card games like Magic with role-playing and digital experiences, and thought about H.P. Lovecraft as a setting. Well, last weekend, to start testing out what the possibilities were, I used Lovable to build Eldritch Trail. It's a fully playable version of the original Oregon Trail, where instead of heading across to the Willamette Valley, you're trying to escape the horrors of Dunwich. You arrive in Arkham and go from there. I'll include a link to play it if you're interested.
Point being that even right now, we are living through this shift in real time. And so what I would be doing if I weren't building Super is, one, I'd be vibe coding all the time. I mean, heck, I'm already doing this.
Two, I'd be learning to read the code, using LLMs as a tutor, of course, because right now I am absolutely constrained in what I can and can't do by just having to press Lovable's try-to-fix button when an error comes up. I'd also probably go back and have AI help me learn the basics.
I'd want to move over time to being less exclusively reliant on the AI, even if it was still writing 99% of the code that I was ultimately producing. And of course, I would be trying to integrate new platforms like the Model Context Protocol that we discussed yesterday in order to expand the set of things that I could actually produce.
This will be unsurprising to any of you who listen to my show regularly, but I think Reid Hoffman is right when he says, most junior developers will also be superpowered with AI tools. Presumably they can handle more workload. The right solution is likely not to downsize teams, instead figure out how to utilize the added capacity of the team.
I think in general, moving outside of the paradigm of "can we do the same with less" and into the paradigm of "what more could we do with our new expanded capabilities" is going to lead us to a better understanding of where the future is actually headed.
So yes, learn to code, but not the version that was available five years ago or three years ago. Not even really the version that's available today. You should be learning to code with the version of coding that's happening six months from now, 12 months from now. To use an overly cliched phrase at this point, skate to where the puck is headed, man. And then send me the link to whatever you build with Lovable or Bolt. For now, that's going to do it for today's AI Daily Brief. Appreciate you listening or watching as always. And until next time, peace.