
Vibe Coding Is The Future

2025/3/5

Lightcone Podcast

People
Gary Illyes, Harj, Jared, Mark Mandel, Melanie Warrick, Priyanka Vergadia
Topics
Gary Illyes, Jared, Harj, Diana: We think "vibe coding" has already become the dominant way to code, and people who don't adapt may be left behind. It creates two roles: product engineers, who need good taste and product sense, and systems engineers, who need systems thinking and architecture skills. In this process, debugging and good taste matter a lot. Current tools are still weak at debugging, so humans still have to debug; newer models may change that.
Andrej Karpathy: "Vibe coding" is a new kind of coding where the developer fully gives in to the vibes, embraces exponentials, and even forgets the code exists.
Founder of Outlet: The role of software engineer will shift to product engineer, and human taste will matter more than ever.
Abhi (Astra): I rarely write code; I just think and review.
Abhi (Copycat): I'm far less attached to my code now, so my decisions about whether to scrap or refactor code are less biased.
Yoav (K60): I write everything with Cursor, sometimes with two Cursor windows open at once, prompting them on different features.
Founder of TrainLoop: Coding speed has accelerated exponentially over the past six months; I've gone from engineer to product person.
Mark Mandel: LLMs are bad at debugging; humans still have to do the debugging.
Melanie Warrick: "Vibe coding" is great for shipping a new product quickly, but once a product reaches product-market fit, strong systems engineering is still needed to scale.
Francesc Campoy: Facebook used PHP early on, a poor language that nonetheless let them ship quickly. At some point it became a major bottleneck to shipping features, so they had to build a custom compiler.
Priyanka Vergadia: Taste matters for AI-native engineers who lack classical training; how to develop their taste is an important question.
Gary Illyes: Drop the entire codebase into Gemini's long context window and tell it to fix the bug; it doesn't always work, but sometimes it fixes things in one shot.
Harj: Triplebyte's goal was to use software to automate evaluating software engineers' technical ability. Today's assessments should account for how developers use code-generation tools. When evaluating engineers, choose the assessment method that fits the specific need rather than one-size-fits-all. In the LLM era, since LLMs can easily solve traditional programming questions, harder and more complex problems are needed to measure an engineer's real ability.

Deep Dive

Chapters
This chapter explores the concept of "vibe coding," a new approach to programming using LLMs, and examines its impact on software engineers. The discussion includes surveys of YC founders' experiences and opinions on the changing role of engineers in the AI era.
  • LLMs are transforming software development.
  • Vibe coding involves embracing LLMs and focusing on the product, not the code itself.
  • The role of software engineers is shifting towards product engineering.

Transcript


It's like somebody dropped some like giant beanstalk seeds at night. We woke up in the morning. I mean, I think our sense right now is this isn't a fad. This isn't going away. This is actually the dominant way to code. And if you're not doing it, like you might just be left behind. Yeah.

Welcome back to another episode of the Lightcone. I'm Gary. This is Jared, Harj and Diana, and we're partners at Y Combinator. Collectively, we funded companies worth hundreds of billions of dollars right when it was just an idea and a few people.

So today we're talking about vibe coding, which is from an Andrej Karpathy post that went viral recently. There's a new kind of coding I call vibe coding, where you fully give in to the vibes, embrace exponentials, and forget that the code even exists.

Yeah, so we surveyed the founders in the current YC batch to get their take on vibe coding. And we essentially asked them a bunch of questions: what tools are you using? How has your workflow changed? And generally, where do you think the future of software engineering is going, and how will the role of software engineer change as we get into

a world of vibe coding. And we got some pretty interesting responses. Anyone have any favorite quotes that jumped out from the founders? I think one of them that I can read verbatim is: I think the role of software engineer will transition to product engineer. Human taste is now more important than ever as codegen tools make everyone a 10x engineer. That's from the founder of Outlet.

I got one. Abhi from Astra said, "I don't write code much. I just think and review." This is like a super technical founder whose last company was also a DevTools company. He's extremely able to code, and so it's fascinating to have people like that seeing things like this. There's another quote from a different Abhi, Abhi from Copycat, who said, "I am far less attached to my code now, so my decisions on whether we decide to scrap or refactor code are less biased."

since I can code three times as fast, it's easy for me to scrap and rewrite if I need to. And I guess the really cool thing about this stuff is it actually parallelizes really well. So Yoav from K60 says: I write everything with Cursor. Sometimes I even have two windows of Cursor open in parallel, and I prompt them on two different features.

Which makes sense. Why not three? You know, you can do a lot, actually. And I think another one that's great is from the founder of TrainLoop. He mentions how coding speed has changed: from six months ago to one month ago was a 10x speedup, and from one month ago to now is a 100x speedup. Exponential acceleration. And he says, I'm no longer an engineer. I'm a product person.

Yeah, that's super interesting. I think like that might be something that's happening broadly. You know, it really ends up being two different roles you need. I mean, it actually maps to how engineers sort of self-assign today in that either you're, you know, front end or back end. And then back end ends up being about actually infrastructure. And then front end is so much more actually being a PM. You're sort of...

almost being like an ethnographer, going into the obscure, underserved parts of the pie of GDP, and trying to extract out: this is what those people in that GDP pie actually want, and then I'm going to turn that into code. And then actually evals are the most important part of that. When I was running Triplebyte, this was actually one of the things we noticed.

It was almost as important as the technical assessment of engineers. When trying to figure out who's a good match for a specific company, there's a certain threshold of technical ability you need. But beyond that, it was: do you actually want to talk to users or not?

Like some engineers are actually a lot more motivated by working on things where they know who the users are, and they get to communicate with them, get live feedback, and iterate, essentially being a product engineer. And other engineers really don't want to do that at all. They find it annoying having to deal with users; they want to just work on hard technical problems and refactor code. That's a backend engineer. Yeah, that's what we call a backend engineer. Yeah, sure. And that's a theme that came up in the survey responses, right? This idea of...

sort of the LLMs are maybe going to push people to choose, because the actual writing of the code may become less important. And it's about, do you have taste and you want to solve product problems? Or are you an architect and you want to solve systems problems? MARK MANDEL: Oh, and interestingly, I guess one thing the survey did indicate is that this stuff is terrible at debugging.

And so you still, the humans have to do the debugging still. They have to figure out, well, what is the code actually doing? Here's a bug. Where's, you know, spot the bug. Where's the code path that, you know, we have some, you know, logic error, you know, just didn't figure this out, right? There doesn't seem to be a way to just tell it debug. You were saying that you have to

be very explicit, like as if giving instructions to a first-time software engineer. I have to really spoon-feed it the instructions to get it to debug stuff. Or you can kind of embrace the vibes. I'd say Andrej Karpathy style is just to ignore the bug and just reroll, just like

Just tell it to try again from scratch. It's wild how your coding style changes when actually writing the code becomes a thousand times cheaper. As a human, you would never just blow away something that you'd worked on for a long time and rewrite it from scratch because you had a bug. You'd always fix the bug.

But for the LLM, if you can just rewrite 1,000 lines of code in six seconds, why not? That's kind of like taking the approach of how people use Midjourney or Playground when you're trying to generate images. If there are artifacts or things that I don't like, sometimes I don't even change the prompt. I just click Reroll. And I do that five times. And sometimes it just works. I'm just like, oh, I can use that now. Which is very different.

frame of building systems because you're not building foundationally step by step. You're really doing it from scratch because fundamentally what's going on is like all these tools today are coming from the world of generated code that are in this latent space hidden somewhere and you have to do it from scratch to find like a different gradient and not get stuck. And then you want to like add a bit of randomness, get it to regenerate. But I do think

Maybe, I don't know, whatever next generation, o5 maybe, we'll get to the point that it's actually able to build upon. I mean, as of right now, I think most of it is you need to re-roll and re-write; it doesn't build upon it yet. But we haven't seen any of the coding tools right now work well with reasoning. I think we have. Well, o3 is infinitely better at debugging than 3.5 Sonnet. So it definitely feels like we're headed in the direction where this may not be true. Yeah.

So, six months from now, the next time we do this episode. Diana, do you want to talk about the models people are using, the IDEs people are using? There are some really interesting trends there. DIANA ALVEAR: Yeah. I think, as we mentioned a couple episodes ago, we already saw this shift start happening. The vibe started to shift back in summer '24, when Cursor was being used by a big portion of the batch.

And now, by far, it is the leader. But the other thing that's happening (this is a very fast-moving environment) is that Windsurf is a fast follower. It's starting to be a very good alternative to Cursor. And I think, Jared, you have some firsthand experience with

why Windsurf is better than Cursor. Yeah. I think the number one reason that people are switching is that Cursor today largely needs to be told what files to look at in your code base. So if you have a large code base, you can tell what to do, but you have to tell it where to look in the code base. Windsurf indexes your whole code base and is pretty good at figuring out what files to look at on its own. There's other differences too, but I think that's the most important one of them.
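To make the "indexes your whole code base" idea concrete, here is a deliberately minimal sketch of the general pattern: build an index over every source file, then rank files by relevance to a prompt. This is an illustration of the concept only, not Windsurf's implementation (which presumably uses embeddings and much smarter retrieval); the names `build_index` and `relevant_files` are hypothetical.

```python
import os
import re
from collections import Counter

def build_index(root):
    """Map each source file under root to a bag of identifier tokens."""
    index = {}
    for dirpath, _, files in os.walk(root):
        for name in files:
            if not name.endswith((".py", ".js", ".ts", ".go")):
                continue
            path = os.path.join(dirpath, name)
            with open(path, encoding="utf-8", errors="ignore") as f:
                tokens = re.findall(r"[A-Za-z_]\w+", f.read())
            index[path] = Counter(t.lower() for t in tokens)
    return index

def relevant_files(index, query, k=3):
    """Rank indexed files by how often the query's tokens appear in them."""
    terms = [t.lower() for t in re.findall(r"[A-Za-z_]\w+", query)]
    scores = {
        path: sum(counts[t] for t in terms)  # Counter returns 0 for misses
        for path, counts in index.items()
    }
    ranked = sorted(scores, key=scores.get, reverse=True)
    return [p for p in ranked[:k] if scores[p] > 0]
```

With an index like this, an editor can decide on its own which files to stuff into the model's context, rather than making the user name them.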

Notably, Devin does get mentioned, but the drawback is that it doesn't really understand the code base, so it's not really being used for serious features.

It's being used mostly for small features, and it's barely mentioned. The other one: people still use ChatGPT, and the reason is that they want to use the reasoning models. People paste in some of their debugging questions to use the more powerful models for reasoning, because right now Cursor and Windsurf are still in the old world (the old world of less than six months ago, pre-reasoning models, without test-time compute). So founders are using that.

And there are some founders who are self-hosting models, maybe because they have more critical, sensitive IP. They do that. And now, talking about the shifts in terms of models: the big game in town for codegen that we saw six months ago was Claude Sonnet 3.5. It's still actually a big contender; most are still using it.

But o1, o1 Pro, and o3-mini, all these reasoning models, are starting to see use. It's almost getting neck and neck now with Sonnet 3.5. The other one is GPT-4o: virtually no use for codegen. And the other interesting thing is DeepSeek R1.

is getting mentioned. It's seen as a viable contender as well. And Gemini, not really mentioned. The one thing I've heard about Gemini, because it has the longest context window: I've heard from a couple of founders that they do use it, and the way they use it is they put their entire code base into the Gemini context window and just tell it to

fix a bug. It doesn't always work, but sometimes it can just one-shot fix it, because the whole thing is in the context window. It will be interesting to see as people adopt the newly released reasoning models with Flash 2.0. I don't think people have tried it yet.
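The "whole codebase in the context window" trick is, mechanically, little more than concatenating every file into one giant prompt. Here's a hedged sketch of that preprocessing step; this is not Gemini's API, the function name is hypothetical, and `max_chars` is a crude stand-in for the model's real token budget. You'd send the resulting string to whichever long-context model you use.

```python
import os

def repo_as_prompt(root, bug_report, max_chars=1_000_000):
    """Concatenate every source file under root into one long-context
    prompt, prefixed with the bug report. Stops when the crude character
    budget (a stand-in for the model's context limit) would be exceeded."""
    parts = [f"Fix this bug:\n{bug_report}\n\nCodebase:"]
    total = len(parts[0])
    for dirpath, _, files in os.walk(root):
        for name in sorted(files):
            if not name.endswith((".py", ".js", ".ts")):
                continue
            path = os.path.join(dirpath, name)
            with open(path, encoding="utf-8", errors="ignore") as f:
                body = f.read()
            # Label each file so the model can cite where the fix goes.
            chunk = f"\n\n--- {os.path.relpath(path, root)} ---\n{body}"
            if total + len(chunk) > max_chars:
                return "".join(parts)  # out of budget: stop adding files
            parts.append(chunk)
            total += len(chunk)
    return "".join(parts)
```

The appeal is that there's no retrieval step to get wrong; the cost is that it only works while the repo actually fits in the context window.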

But the long context window plus reasoning could be a good contender. What is the estimated share of code being written by LLMs in the current batch? This is pretty crazy. So we explicitly asked this question: what percent of your code base do you estimate is AI-generated? The way I interpreted the question is: of the actual characters in your code base, not including any libraries you imported, what percentage of the characters were typed by human hands versus emitted by an LLM.

And the crazy thing is one quarter of the founders said that more than 95% of their code base was AI generated, which is like an insane statistic.

And it's not like we funded a bunch of non-technical founders. Every one of these people is highly technical, completely capable of building their own product from scratch. A year ago, they would have built their own product from scratch, but now 95% of it is built by an AI. Except maybe, it sounds like, for one or two examples of people who are so young that they learned to code in the last two years. Yeah.

So they actually don't know a world where Cursor didn't exist. Yeah. One of my best companies, Dispatch, actually is exactly this. The founders are extremely technically minded, but they're not classically trained in computer science and programming. And they are incredibly productive, able to produce a ton of really amazing product, and AI is writing almost the entire thing. It kind of makes me think a lot of...

the discourse around Gen Z being the first digital natives who grew up with the internet. This is the generation that grew up with AI-native coding tools: they skip the classical training of a software engineer and just do it with the vibes. But they are actually very technically minded. I mean, they have degrees in math and physics. Math and physics, yeah. So they have that raw...

let's call it a systems-thinking type of mind, which you still need. Maybe we should talk a bit about that: what's still the same and what has changed? I think this vibe coding will enable people who have those kinds of technical minds, who come from other technical disciplines like math and physics,

to become highly productive as programmers much faster than in the past. I remember there were coding bootcamps; back in the day they would try to retrain physics people into programmers, and it didn't work that well, because it just takes too long to learn all of the syntax and all of the libraries and all the stuff you have to know to be really productive. But now,

Now it's a new world. The coding bootcamps were also very specifically focused on getting you hired at companies. And I think it was around the 2015 era that companies themselves were rethinking

how to evaluate software engineers and their hiring processes. And it was moving. There was a real shift away from "we want to hire classically trained computer scientists, whiteboard algorithmic problems" towards "we actually want people who are just really productive and write code quickly." And some of these arguments are

evergreen, eternal, right? Like I remember when Rails first came out, there was this real sense that ActiveRecord as a way to interact with your database was a great abstraction, but there was still this same flavor of argument, right? Like, if you don't really understand the internals, you're just going to write crappy, low-performing web software. How do you feel those arguments have

aged, if you look back on it now? My feeling is that many of the most successful companies (I would say Stripe and Gusto are just two that really spring to my mind) are ones that really heavily leaned into "actually, we just want people who are really productive with the tools," and "we're going to

change our whole hiring process to just select for people who are good with the tools." The interview shifted from "teach us how you think" to "you've got three hours on the laptop and you need to build a to-do list app, and build it as quickly as you can." And those companies have had a tremendous amount of success. It does seem like at some point, as they grew and they scaled, the bottleneck did actually become having people who are classically trained and systems thinkers to sort of

scale up and architect things. It does seem like how people are hiring engineers is changing, but maybe not changing fast enough yet. The results of the survey are relatively surprising to the four of us here. Probably pretty shocking out there.

It's just this thing that popped up in our backyard only in the last six to nine months. My guess would be that engineering hiring, period, has not actually caught up to this. People are still standing at whiteboards and doing that kind of thing, as opposed to "what can you get done." And so it sounds like the Stripes of the world were ahead of the game, and everyone has to hire engineers this way now. I mean, I wonder if actually even that's going to be sort of the old

meta. I mean, something that stood out from the survey responses was this idea of two themes we talked about, right? Like one is how, okay, we're all just product people now. Like actually the thing that you need is really great taste and understand what to build. And the second was the idea of actually now what's really valuable is to be like a systems thinker and an architect and to really sort of understand the bigger picture. In which case,

Actually, maybe being a really productive coder, because that's definitely something that always fit my definition when you're talking about who the great engineers are, you know, one of the dimensions is they can write code really fast. Maybe that's outdated. If the LLMs are actually really good at writing code quickly, then to your point, now it's actually just cheaper to re-roll and write everything from scratch than to try and debug.

The skills might just be completely different. The problem is there are two different stages. There's zero to one, in which case speed is the only thing that matters. And then, to your point about ActiveRecord and Rails, that

battle was actually fought to a standstill, because of course using ActiveRecord or Rails allowed you to go from zero to one very quickly. But then what happened to Twitter? It became the fail whale, right? Basically, once you get to one, that architecture will not get you to a billion or ten billion or a hundred billion dollars in valuation, or users, or whatever. It's just not going to actually work. So I think you're going to see

the same thing. And there's nuance in what you just said, right? Getting from zero to one quickly and being able to scale to a billion users are two totally different skill sets. And I think that might be where people are irreplaceable for now. One of the things I discovered as we were scaling one of the biggest Rails sites (I mean, you scaled one of the biggest Rails sites too) is that there aren't that many people who have had to do it.

Getting to one is so rare. MARK MANDEL: Yeah, that's true. MARK MIRCHANDANI: Did you reach this point where the way you got from zero to one very quickly was that you used lots and lots of open source, and then at some point, maybe two years in, maybe even a year and a half into the startup, you could not use random gems anymore? MARK MANDEL: We couldn't use gems anymore, because they were just never designed for companies at our scale. And so we had to-- GARY ILLYES: Like deploy it and it would fall over. MARK MANDEL: --build our own stuff. Yeah. GARY ILLYES: It would just fall over. MELANIE WARRICK: This is a very good example of what you're saying, Gary. I think maybe to summarize a bit more,

Zero to one will be great for vibe coding, where founders can ship features very quickly. But once they hit product-market fit, they're still going to have a lot of really hardcore systems engineering, where you need to get from one to N, and you need to hire very different kinds of people. And for that, I think there's a very good historical example: Facebook. I mean, they got- PHP. Yeah, they got away with PHP, which personally I think is a terrible language. Maybe I'll get flamed, but-

I think it's a bad language. MARK MANDEL: I'm with you. I never liked it. Sorry, guys. FRANCESC CAMPOY: It was very bad. But you could ship things very quickly. But at some point, it became such a big bottleneck for them to ship features that they had to hire the hardcore system people to build a custom compiler, HipHop. MARK MANDEL: Yeah.

so that it would run fast on bare metal, because it was just too expensive to replace all the code. And the kind of people who did that are not the vibe-code people; they're these hardcore systems people. And based on our survey, current tools are not good at that low-level systems engineering. Harj, I'm not sure everybody who's listening knows what Triplebyte is, but it's actually very relevant. Do you want to just describe for everybody what... Yeah, Triplebyte was the company I started in 2015, and our...

we were essentially building a technical assessment for engineers. Our goal was: how can you use software to automate evaluating software engineers? And the way we did it, pre all these codegen models, was we built all of our own custom software to interview engineers, had humans interview engineers, and then essentially labeled the data. And interviewed them by asking them to write code. It was a highly technical interview. Yeah. Yes. It was asking them to write code,

We did actually include algorithmic problems. MARK BLYTH: And is it true that you and your co-founders have done more technical interviews than any other people on the planet? MARK BLYTH: I think so, in terms of just pure hours. Because that was just the early days of it, where all day, every day, just interviewing people. MARK BLYTH: Like thousands and thousands of them, right? MARK BLYTH: And then we scaled up, and we had a team of about 100 engineers contracted just

that we would pay per interview completed. And so you're exactly the right person to ask this question, because you've literally spent more time thinking about this than anyone else on the planet. If you were starting Triplebyte again today and you had to design new technical assessments for engineers, what would you have them do?

The big takeaway I have with Triplebyte and the screen in particular is just people want different things. And so you kind of need to know upfront what exactly it is that you're evaluating for and then design your technical screen

around that. It's kind of what I'm getting at with how Stripe and Gusto and these companies just knew that they didn't care if someone had fundamental CS knowledge, so it didn't make sense to screen them on that. They wanted to screen for the thing they were actually going to do in their job. And then our product was more trying to screen for everything, trying to get a taste of everything

companies might want and then figure out what someone's max skill was and then send the people with the max skill to the companies that would value that max skill. And in today's world, I think I would actually have a screen that

at least accounted for just how well people know how to use these tools. So again, it's maybe contrary to what I was saying earlier, but it might be the case that how quickly you can code and build product is actually something to explicitly screen on, just with the

bar much higher. You probably have to ask different questions, because I'll bet if you go back to the original Triplebyte assessment, a lot of those questions you could literally just copy and paste into ChatGPT and it would spit out a perfect answer. In which case you're not really proving that much competence if you're just copying. The questions probably have to be a hundred times harder. This gets to the deepest stuff, right? Not necessarily, because it depends on what conditions you're going to put on the screen, which I think is interesting. A classic question was: build tic-tac-toe.

Yes, of course. If you do that unsupervised and you just let someone come back with their tic-tac-toe solution, that's going to take two seconds. If you want to watch them code it and force them to not use an LLM... I guess that's a question. Do you force them to code it without an LLM? With the old questions? Or do you let them use an LLM and now you need new questions because the old ones became trivial? That, I think, is what...

everyone hiring software engineers right now should be thinking about and trying to figure out. Yeah, I'm not sure I know what the correct answer is to that. Yeah. I think there's going to be-- probably you're going to test for different things, because they also did a lot of engineering hiring.

I think one key skill is going to remain constant: I do think skills of reading code and debugging are paramount. You have to have the taste and enough training to know whether the LLM is spitting out bad stuff or good stuff, bad code or good code. And I think you can see it clearly sometimes if a candidate is using the tools and they

can look at a reasonable solution the LLM outputs and then decide, oh, this is actually bad. That is a signal. So I think it's knowing, at the higher level of thinking, what is good versus bad. In order to do good vibe coding, you still need to have the taste, and you still need that kind of classical training (maybe not necessarily classical training, but enough knowledge) to judge what's good versus bad. And you only become good with enough practice. I think that will be one constant. That would be my opinion.

MARK MANDEL: Yeah, that's interesting. It's more like code review as the interview, versus actually producing code. PRIYANKA VERGADIA: Yeah. I mean, you could have some form of system design. You want to know how well they can put a product out there. So then it's testing for taste. So we're going to test for debugging and then taste. But then how do you get to-- I guess this is a question for these kids that we call the AI coding natives. MARK MANDEL: Yeah. PRIYANKA VERGADIA: How do you develop taste

When you don't come from a classically trained world, which would be interesting for next generation founders. Well, you have to because if you don't, the startup dies, right? So let's say this founder, they go off, they have 95% written by AI. The proof is in a year out, two years out, they have 100 million users on that thing.

you know, does it fall over or not? And then one of the things that's pretty clear is these systems, you know, in the first realm, the first versions of reasoning models, they're not that good at debugging. So you actually would need to descend down into the depths of what's actually happening. And if you can't, then you got, I mean, let's hope that they can go find another architect. They're going to have to hire someone who can. I think there's going to be a generation of

software engineers that are good enough, because it's so easy to retool with all these codegen tools. The barrier is so low. You're going to get good-enough engineers; there are going to be tons of those.

But to be exceptional, the top 1%, I think you're going to need deliberate practice. I mean, the analogy we're talking about: Malcolm Gladwell popularized this concept of 10,000 hours of practice to become an expert, which came from research by, what was his name? Anders Ericsson. Anders Ericsson, right? And the research was very specific. It was about...

how do you find world-class violinists? And it wasn't just about putting in the time, but deliberate practice: hours that are actually planned and thought through, and it's hard work. You could become an expert with fewer hours. So I think what's happening with codegen tools now is that it's very cheap to put in the hours, because the output comes so quickly. You can get to good enough, and

But to become the best in the world and the best founder, you're going to need that deliberate practice to go into the details. And you're going to have to peel the onion and understand the systems and get to, again, to some extent being classically trained. I mean, a good example is maybe we go back to history. It's like Picasso, one of the greatest painters ever.

He was amazing at drawing lifelike pictures. Which is not what he's famous for. Of course, when you imagine a Picasso, you imagine the opposite of that. Yeah. There's this famous sequence of drawings of how he got to an abstract bull. It starts from lifelike and goes through iterations until he gets to the essence, to the abstract art he's very well known for. But he could only get to be the best in the world because he was actually a very good

painter, and classically trained; he could draw super well. But that's not what he's known for. So I do think we'll see these two classes of engineers. You'll still have a very fat class of good enough. You need engineers for those.

But the best in the world, the founders who become outliers, are going to need to put in the deliberate practice. Yes and no. I mean, I think there are lots of really amazing examples of great systems-level, world-class engineers who ended up being CEOs of the biggest public companies in the world. I think of Max Levchin. I think of Toby Lutke from Shopify. I mean, these are people who are just actually that great.

And the thing is, there are lots of other people who are not that great, but also still CEO or co-founders of companies. And then it kind of goes back to link up what we were saying earlier. It goes back to hiring. I mean, I keep thinking about the Twitter analogy that you brought up. So I think it's a really interesting one. If you compare Facebook and Twitter, in both cases, they went very quickly from zero to one in sort of scrappy move fast, break things way.

Facebook was able to solve the scaling technical challenges in a pretty impressive way. I think most people would agree. I mean, Mark Zuckerberg was...

by far way more technical and way more in the weeds probably. Maybe, but I don't know. I think Twitter scalability challenges were also harder based on the usage patterns. The thing about the usage of Facebook is that it's pretty smooth throughout the day. People just use it all the time. The problem with Twitter is that the usage is incredibly spiky. You get a Super Bowl or a world event and all of a sudden you have 10 times as much usage

The way the fan out of the feed works is, I think, like fundamentally a very difficult computer science problem. OK, that's fair. Though I also think that they were like really hamstrung by their tools. Do you remember using this terrible queue system called Starling? Absolutely. I used it because I thought, oh, Twitter is so much bigger than us. They're so smart. They wouldn't use something that's crap.
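To make the fan-out point concrete, here is a toy sketch of one common design, fan-out-on-write, where each post is copied into every follower's timeline at write time. This is an illustration of the general pattern, not Twitter's actual architecture; the class and method names are hypothetical. It shows why spiky events hurt: one post by a huge account costs one write per follower, all at once.

```python
from collections import defaultdict, deque

class FanOutOnWrite:
    """Toy fan-out-on-write feed: a post is pushed to every follower's
    timeline when it is written, so reads are cheap but a post by a
    user with N followers costs N timeline writes."""

    def __init__(self, timeline_len=10):
        self.followers = defaultdict(set)  # author -> set of followers
        self.timelines = defaultdict(lambda: deque(maxlen=timeline_len))

    def follow(self, follower, author):
        self.followers[author].add(follower)

    def post(self, author, text):
        writes = 0
        for follower in self.followers[author]:
            # Newest first; old entries fall off via maxlen.
            self.timelines[follower].appendleft((author, text))
            writes += 1
        return writes  # work done scales with the author's audience
```

Reading a timeline is then just a lookup, which is the whole appeal; the flip side is exactly the Super Bowl problem, where write amplification spikes with event-driven posting by heavily-followed accounts. Real systems mix this with fan-out-on-read for celebrity accounts.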

No, they totally used crap. And then I used crap and I couldn't make it work. It was dropping jobs on the floor; all these crazy bugs happened. And then finally I was like, I'm not using that anymore. I have to switch to RabbitMQ or whatever the heck the actually correct thing to use was. Yeah. And Ruby is an incredibly slow language, even like 10x slower than PHP, which was already too slow.

So, I don't know. I mean, basically you should be so lucky to get to one. Yeah. Is there an advantage for a technical founder to be classically trained and be a really deep systems thinker? Well, I mean, you just, I mean, a Toby or a Max Levchin is not going to get bullshitted by people. Patrick from Stripe is the same. I mean, I'll tell you a crazy story. When I was...

At Palantir, I sort of burnt out there after a couple of years after I designed the logo. And then I actually, between that and going to start my YC startup, I spent six months as an interaction designer. And I was at this like...

terrible venture-backed company that ended up going into the ground. It was credit card software, which is the worst. I spent six months building basically just interaction designs, which was really fun. That's what allowed me to work on my startup in my spare time, because I had a lot of spare time. But I remember designing this faceted search thing for rental cars or something like that, and I go into my meeting with my dev manager and the engineers who are going to implement it, and

They were like, oh, yeah, it can't be done. We can't do it that way. Oh, like... And I was like, what are you talking about? Just make the indexes like this. And they were like, whoa! What do you mean? And then they looked up my resume. Didn't expect to hear that from your interaction designer. Yeah, basically. They're like, how did you know that? And I was like, you fucking lied to me. And that's the thing. Like, what founders... And, like, you know, when you're hiring people, like, that was, like, the wildest thing to me. But it was, like, sort of the...

Yeah.

And then the worst part is, you kind of have to call them on it. Sometimes you have workplace cultures that are so polite that people are like, oh, I'm going to let that pass, and then I'm going to talk shit about them behind their back. And it's like, no, you should fire them. The AI agents, incidentally, will do exactly the same thing. The AI agents will absolutely bullshit you, just like a human employee will, if you're not technical enough to call them out on their bullshit and be like, no, you didn't make the change that I asked for.

It goes back to your point about why being classically trained is still helpful. You have to be able to call out all the people working for you, whether they're human or not. Being technical enough to be able to do that is a superpower. So just to wrap up: what's going on is all these tools are giving superpowers to the best engineers and making the bad engineers worse. This is a quote from the founder of TrainLoop on how coding has changed: from six months ago to one month ago was a 10x speedup; from one month ago to now,

a 100x speedup. It's exponential acceleration. It sort of crept up on us, actually. Yeah, it was like somebody dropped some giant beanstalk seeds at night. We woke up in the morning. I mean, I think our sense right now is this isn't a fad. This isn't going away. This is actually the dominant way to code. And if you're not doing it,

like, you might just be left behind. This is just here to stay. And, you know, vibe coding is not a fad. It's time to accelerate. So with that, we'll see you guys for the next Lightcone.