DHH on vibe coding with AI: It makes you dumber | Lex Fridman Podcast Clips - https://www.youtube.com/watch?v=6i5hvNA72ZU

People will probably argue about this, but I really like working together with AI, collaborating with AI. And I would argue that the kind of code you want AI to generate is human readable.

Human interpretable, yes. If it's generating Perl golf code, it's just... it's not a collaboration. So it has to be speaking human. It's not just that you're writing the prompts in English; you also want to read the responses in a human-interpretable language, like Ruby.

Right. So that actually is beneficial for AI too, because you kind of said that for you, the sculptor, the sort of elitist Coco Chanel sculptor, you want, on your fancy keyboard, to type every single letter yourself with your own fingers. But the benefit of Ruby also applies when some of that is written by AI and you're doing the editing part with your own fingers, because you can interact with it, because it's human interpretable.

The paradigm I really love with this was something Elon actually said on one of your shows when you guys were talking about Neuralink: that Neuralink allows the bandwidth between you and the machine to increase. Language, either spoken or written, is very low bandwidth; if you calculate just how many bits we can exchange as we're sitting here, it's very slow. Ruby has a much higher bandwidth of communication. It conveys so much more concept per character than most other programming languages do. So when you are collaborating with AI, you want really high bandwidth. You want it to be able to produce programs with you, whether you're letting it write the code or not, that both of you can actually understand really quickly, and that compress a grand concept, a grand system, into far fewer parts that both of you can understand.

I actually love collaborating with AI too. I love chiseling my code.
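A rough illustration of that concept-per-character point (the data and names here are made up for illustration, not from the conversation): "take the completed orders and sum their totals" in Ruby reads almost like the sentence itself.

```ruby
# Hypothetical data, just to show Ruby's density of intent per character.
orders = [
  { id: 1, status: "done",    total: 30 },
  { id: 2, status: "pending", total: 10 },
  { id: 3, status: "done",    total: 50 },
]

# Keep the completed orders, sum their totals:
revenue = orders
  .select { |o| o[:status] == "done" }
  .sum { |o| o[:total] }

puts revenue  # prints 80
```

It's the Enumerable methods (`select`, `sum`, and friends) that carry most of that compression; the equivalent loop-and-accumulator code in many lower-level languages takes several times the characters to say the same thing.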
And the way I use AI is in a separate window. I don't let it drive my code. I've tried that. I've tried the Cursors and the Windsurfs, and I don't enjoy that way of writing. And one of the reasons I don't enjoy that way of writing is I can literally feel competence draining out of my fingers. That level of immediacy with the material disappears. And where I felt this the most was when I did this remix of Ubuntu called Omakub when I switched to Linux. It's all written in Bash. I'd never written any serious amount of code in Bash before, so I was using AI to collaborate, to write a bunch of Bash with me, because I needed all this. I knew what I wanted. I could express it in Ruby, but I thought it was an interesting challenge to filter it through Bash, because what I was doing was setting up a Linux machine, and that's basically what Bash was designed for. It's a great constraint. But what I found myself doing was asking AI for the same way of expressing a conditional, for example, in Bash, over and over again. By not typing it, I wasn't learning it. I was using it; I was getting the expression I wanted, but I wasn't learning it. And I got a little scared. Is this the end of learning? Am I no longer learning if I'm not typing? And the way I recast that for myself was: I don't want to give up on the AI. It is such a better experience as a programmer to look up APIs, to get a second opinion on something, to do a draft. But I have to do the typing myself, because you learn with your fingers. If you're learning how to play the guitar, you can watch as many YouTube videos as you want; you're not going to learn the guitar. You have to put your fingers on the strings to actually learn the motions. And I think there is a parallel here to programming, where programming has to be learned in part by the actual typing.

This is fascinating. Listen, part of my brain agrees with you 100%. Part doesn't.
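For context, the conditional forms he describes repeatedly asking the AI for look roughly like this; a minimal sketch, with a made-up package name, in the setup-script style Omakub-like Bash tends to use:

```shell
#!/usr/bin/env bash
set -euo pipefail

pkg="git"  # hypothetical package name, for illustration only

# String comparison: [[ ]] is Bash's extended test construct
if [[ "$pkg" == "git" ]]; then
  echo "configuring $pkg"
fi

# File test: -f checks that a regular file exists
if [[ -f "$HOME/.bashrc" ]]; then
  echo "found .bashrc"
fi

# Command-availability test: no brackets at all, which is easy to forget
if command -v "$pkg" >/dev/null 2>&1; then
  echo "$pkg is already on PATH"
fi
```

Three conditionals, three slightly different syntaxes; exactly the kind of detail that never sticks if the AI types it for you every time.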
I think AI should be in the loop of learning. Now, current systems don't do that, but I think it's very possible for Cursor to, say, basically force you to type certain things if you set the mode to learning. I just don't want to sort of give up on AI. I think vibe coding is a skill. For an experienced programmer, it's too easy to dismiss vibe coding as a thing.

I agree. I wouldn't dismiss it.

But I think you need to start building that skill and start to figure out: how do you prevent the competency from slipping away from your fingers and brain? How do you develop that skill in parallel to the other skill? I don't know. I think it's a fascinating puzzle, though. I know too many really strong programmers who just kind of avoid AI because it's currently a little too dumb.

Yes. It's a little too slow, actually, is my main problem. It's a little too dumb in some ways, but it's a little too slow in other ways. When I use Claude Code, the terminal version of Claude, which is actually my preferred way of using it, I just get too impatient. It feels like I'm going back to a time where code had to compile and I had to go do something else, boil some tea while the code was compiling. Well, I've been working in Ruby for 20 years; I don't have compile wait in me anymore. So there's that aspect of it. But I think the more crucial aspect for me is I really care about the competence. And I've seen what happens to even great programmers the moment they put away the keyboard. Because even before AI, this would happen as soon as people got promoted. Most great programmers who work in large businesses stop writing code on a daily basis because they simply have too many meetings to attend, they have too many other things to do, and invariably they lose touch with programming. That doesn't mean they forget everything.
But if you don't have your fingers in the sauce, the source, you are going to lose touch with it. There's just no other way. I don't want that, because I enjoy it too much. This is not just about outcomes; this is what's crucial to understand. Programming, for programmers who like to code, is not just about the programs they get out of it. That may be the economic value. It's not the only human value. The human value is just as much in the expression. When someone sits down with a guitar and plays Stairway to Heaven, there's a perfect recording of that that will last for eternity. You can just put it on Spotify. You don't actually need to do it. The joy is to command the guitar yourself. The joy of a programmer, of me as a programmer, is to type the code myself. If I elevate myself, if I promote myself out of programming, I turn myself into a project manager, a project manager of a murder of AI crows, as I wrote the other day. I could have become a project manager my whole career. I could have become a project manager 20 years ago if I didn't care to write code myself and just wanted outcomes. That's how I got started in programming: I just wanted outcomes. Then I fell in love with programming, and now I'd rather retire than give it up. Now, that doesn't mean you can't have your cake and eat it too. I've done some vibe coding where I didn't care that I wasn't playing myself. There was an idea in my head; I just wanted to see something. That's fine. I also use AI all day long. In fact, I'm already at the point where if you took it away from me, I'd be like, oh my God, how do we even look things up on the Internet anymore? Is Stack Overflow still around? Are forums still a thing? How do I even find answers to some of these questions I have all day long? I don't want to give up AI. In fact, I'd say the way I like to use AI, I'm getting smarter every day because of it, because I'm using AI to have it explain things to me, even stupid questions
I would be a little embarrassed to even enter into Google. AI is perfectly willing to give me the ELI5 explanation of some Unix command I should have known already, but I don't. I'm sorry, can you just explain it to me? And now I know the thing. So at the end of a day of working with AI all day long, I'm a little bit smarter. Like 5%. Sorry, not 5%. Half a percent. Maybe that compounds over time. But what I've also seen, when I worked on the Omakub project and tried to let AI drive for me: I felt I was maybe half a percent dumber at the end of the day.

Okay, you said a lot of interesting things. First of all, let's just start with the very fact of asking dumb questions. If you go to Stack Overflow and ask a dumb question, or read somebody else's dumb question and the answer to it, there's a lot of judgment there.

Yes. AI, sometimes to an excessive degree, has no judgment. It usually says, oh, that's a great question, to a fault.

Yeah. Oh, that's wonderful. Yeah. I mean, it's so conducive to learning. It's such a wonderful tool for learning, and I too would miss it. It's a great, basically, search engine into all kinds of nuances of a particular programming language, especially if you don't know it that well, or APIs; you can load in documentation. It's just so great for learning. For me personally, on the happiness scale, it makes me more excited to program. I don't know what that is exactly. Part of that is... I'm really sorry, Stack Overflow is an incredible website, but there is a negativity there, there's a judgment there. It's just exciting to be with a hype man next to me, just saying, yeah, that's a great idea. And I'll say, no, that's wrong. I'll correct the AI, and the AI will say, you're absolutely right, how did I not think about that? Rewrite the code. I'm like, holy shit. It's like a buddy that's really being positive and is very smart and is challenging me to think.
And even if I never use the code it generates, I'm already a better programmer. But actually the deeper thing is, for some reason, I'm having more fun. That's a really, really important thing.

I like to think of it as a pair programmer for exactly that reason. Pair programming came into vogue in the 2000s, where you'd have two programmers in front of one machine and you'd push the keyboard between you. One programmer would be driving, typing things in; the other programmer would essentially sit and watch the code, suggest improvements, look something up. That was a really interesting dynamic. Now, unfortunately I'm an introvert, so I can do that for about five minutes before I want to jump off a bridge. So it doesn't work for me as a full-time occupation. But AI allows me to have all the best of that experience all the time. Now, I think what's really interesting is what you said about it making things more fun. I hadn't actually thought about that. But what it's made more fun for me is to be a beginner again. It made it more fun to learn Bash successfully for the first time. Now, I had to do the detour where I let it write all the code for me, and I realized I wasn't learning nearly as much as I hoped I would; that started once I typed it out myself. But it gave me the confidence that, you know what, if I need to do some iOS programming myself? The last time I dabbled in that was probably six years ago, and I never really built anything for real. I feel highly confident now that I could sit down with AI and have something in the App Store by the end of the week. I would not have that confidence unless I had a pair programming buddy like AI. I don't actually use it very much for Ruby code. I'm occasionally impressed whenever I try it, like, oh, it got this one thing right, that is truly remarkable. And it's actually pretty good. And then I'll ask it two more questions and I go like, yeah, okay.
If you were my junior programmer, I'd start tapping my fingers and going, you gotta shape up now. The great thing, of course, is we can just wait five minutes. The Anthropic CEO seems to think that 90% of all code by the end of the year is going to be written by AI. I'm more than a little bit skeptical about that, but I'm open-minded about the prospect that programming, when done manually, will potentially turn into a horse: something we do recreationally. It's no longer a mode of transportation to get around LA. You're not going to saddle up and go to the grocery store and pick up stuff from Whole Foods in your saddlebags. That's just not a thing anymore. That could be the future for manual programming. Entirely possible. I also don't care. Even though we have great renditions of all the best songs, as I said, there are millions of people who love to play the guitar. It may no longer have as much economic value as it once did. That, I'm quite convinced, is true; we have perhaps seen the peak. Now, I understand the paradox: when the price of something goes down, the overall usage actually goes up, and total spend on that activity goes up. That could also happen, maybe. But what we're seeing right now is that a lot of the big shops, a lot of the big companies, are not hiring like they were five years ago. They're not anticipating they're going to need tons more programmers. Controversially, Tobi actually put out a memo inside of Shopify asking everyone who's considering hiring someone to ask the question: could this be done by AI? Now, he's further ahead on this question than I am. I look at it from the coding trenches and I go, I'd love to use AI more, and I see how it's making us more productive, but it's not yet at the level where I just go, oh, we have this project, let me just give it to the AI agent and it's going to go off and do it.

But let's just be honest.
You're like a Clint Eastwood-type character, a cowboy on a horse, seeing cars going around.

Well, that's part of it. And I think it is important to have that humility: what you are good at may no longer be what society values. This has happened a million times in history. You could have been exceptionally good at saddle making, for example. That's something a lot of people used to care about, because everyone rode a horse, and then suddenly riding a horse became this niche hobby that some people care about, but not nearly as many. That's okay. Now, the other thing of this is I've had the good fortune to have been a programmer for nearly 30 years. That's a great run. I try to look at life in this way: I've already been blessed with decades of economically viable, highly valuable ways of translating what I like best into the working world, to write Ruby code, and that was so valuable that I could make millions and millions of dollars doing it. And if that's over tomorrow, I shouldn't look at that with regret; I should look at it with gratitude.

But you're also a highly experienced, brilliant, and opinionated human being. So it's really interesting to get your opinion on the future of the horse, because there's a lot of young people listening to this who love programming, or who are excited by the possibility of building stuff with software, with Ruby, with Ruby on Rails, that kind of language. And now the question: is it a career? And if indeed a single person can build more and more and more with the help of AI, how do they learn that skill? Is this a good skill to learn? That to me is the real mystery here, because I think it's still absolutely true that you have to learn how to program from scratch.

Currently, yes.

But how do you balance those two skills?
Because I too, as I'm thinking now, there is a scary slipping away of skill that happens in a matter of really minutes on a particular piece of code. It's just scary. Driving skill, when you have a car drive for you, doesn't quite slip away that fast. So that really scares me. So when somebody comes up to me and asks, how do I learn to program? I don't know what the advice is, because I think it's not enough to just use Cursor or Copilot to generate code.

It's absolutely not enough. Not if you want to learn, not if you want to become better at it. If you just become a tab monkey, maybe you're productive in a second. But then you have to realize, well, can anyone just tab? If all we're doing is sitting around all day long tabbing yes, yes, yes, yes, yes, that's not a marketable skill. Now, I always preface this, both to myself and when I speak to others about it, with rule number one: nobody fucking knows anything. No one can predict even six months ahead. Right now we're probably at peak AI future hype, because we see all the promise, because so much of it is real and so many people have experienced it themselves: this mind-boggling thing that the silicon is thinking in some way that feels eerily reminiscent of humans. I'd actually say the big thing for me wasn't even ChatGPT, it wasn't even Claude. It was DeepSeek. Running DeepSeek locally and seeing the think box where it converses with itself about how to formulate the response, I almost wanted to think: is this a gimmick? Is it doing this as a performance for my benefit, and that's not actually how it thinks? Because if this is how it actually thinks, okay, I'm a little scared.

Yeah, this is incredibly human, how it thinks in this way.

But where does that go? So in '95, one of my favorite movies, one of my favorite B movies, came out: The Lawnmower Man. Incredible movie about virtual reality, being an avatar and living in VR.
The story was a mess, but the aesthetics, the world it built up, were incredible. And I thought: we're five years away. I'm going to be living in VR. I'm just going to be floating around. I'm going to be an avatar. This is where most humans will spend most of the day. That didn't happen. We're 30 years later; VR is still not here. It's here for gaming, it's here for some specialized applications. My oldest loves playing Gorilla Tag. I don't know if you've tried that; that's basically the hottest VR game.

Wonderful. That's great.

It's really hard to predict the future, because we just don't know. And then when you factor in AI, you have even the smartest people going, I don't think we fully understand how this works. But then on the flip side, you have Moore's Law, which worked for many, many years in decreasing the size of the transistor, for example. So Flash didn't take over the Internet, but Moore's Law worked. We don't know which one AI is. And this is what I find so fascinating too. I forget who did this presentation, but someone in the web community gave this great presentation on the history of the airplane. You go from the Wright brothers flying in what is 1903 or something like that, and 40 years later you have jet flight. Just an unbelievable amount of progress in four decades. Then in '56, I think it was, the whole design for the Boeing 707, essentially the precursor to the modern airliner, was set, and basically nothing has happened since. Just minor tweaks and improvements on the flying experience since the 50s. Somehow, if you were to predict where flying was going to go, and you were sitting in '42 and remembered the Wright brothers flying in '03 and saw the jet engines coming, you'd go: we're going to fly to the stars in another two decades. We're going to invent super mega hypersonic flights that are going to traverse the Earth in two hours. And then that didn't happen. It tapped out.
This is what's so hard about predicting the future. We can be so excited in the moment because we're drawing a line through early dots on a chart, and it looks like those early dots are just going up and to the right, and sometimes they just flatten out. This is also one of those things where we have so much critical infrastructure, for example, that still runs on COBOL, which maybe about five humans around the world truly, deeply understand. It's possible for society to lose a competence it still needs because it's chasing the future. COBOL is still with us. This is one of the things I think about with programming. Ruby on Rails is at such a level now that 50 years from now, it's exceedingly likely there will still be a ton of Ruby on Rails systems running around. Now, it's very hard to predict what that exact world is going to be like. But yesterday's weather tells us that if there's still COBOL code from the 70s operating Social Security today, and we haven't figured out a clean way to convert it, let alone understand it, we should certainly be humble about predicting the future. I don't think any of the programmers who wrote that COBOL code back in the 70s had any damn idea that in 2025 checks would still be cut off the business logic they had encoded back then. But that just brings me to the conclusion on the question of what a young programmer should do. You're not going to be able to predict the future. No one is going to be able to predict the future. If you like programming, you should learn programming now. Is that going to be a career forever? I don't know. But what's going to be a career forever? Who knows? A second ago we thought it was blue-collar labor that was going to be automated first, that the robots were going to take over. Then generative AI comes out, and all the artists suddenly go, holy shit, is this going to do all animation now? Is it going to do all music now? They get real scared.
And now I see the latest Tesla robot and go, oh, maybe we're back to blue collar being in trouble, because if it can dance like that, it can probably fix a toilet. So no one knows anything, and you have to position yourself for the future in such a way that it doesn't matter: you pick a profession or path where, if it turns out you have to retool and reskill, you're not going to regret the path you took. That's a general life principle for me, how I look at all endeavors I involve myself in: I want to be content with all outcomes. When we start working on a new product at 37signals, I set up my mental model for its success and I go: do you know what? If no one wants this, I will have had another opportunity to write beautiful Ruby code, to explore a greenfield domain, to learn something new, to build a system I want even if no one else wants it. What a blessing, what a privilege. If a bunch of people want it, that's great; we can pay some salaries, we can keep the business running. And if it's a blow-away success, wonderful, I get to impact a bunch of people.

I think one of the big open questions to me is how far you can get with vibe coding. Whether the right approach for a young developer is to invest most of their time into vibe coding or into writing code from scratch. By vibe coding, and I'm leaning into the meme a little bit, I mean you have this idea of a thing you want to create, you generate the code, and then you fix it, both with natural language in the prompts and manually. You learn enough to manually fix it. So that's the learning process: how you fix code that's generated. Or you write code from scratch and have the LLMs kind of tab, tab, tab, add extra code. Which part do you lean on? I think, to be safe, you should find the beauty and the artistry and the skill in both, right?
So, like, there should be some percent of your time just writing from scratch and some percent vibe coding.

There should be more of the time writing from scratch if you are interested in learning how to program. Unfortunately, you're not going to get fit by watching fitness videos. You're not going to learn how to play the guitar by watching YouTube guitar videos. You have to actually play yourself. You have to do the sit-ups. Programming, understanding, learning almost anything requires you to do it. Humans are not built to absorb information in a way that transforms into skills just by watching others from afar. Now, ironically, it seems AI is actually quite good at that, but humans are not. If you want to become a competent programmer, you have to program. It's really not that difficult to understand. Now, I understand the temptation, and the temptation is there because vibe coding can produce things, perhaps in this moment, especially in a new domain you're not familiar with, with tools you don't know perfectly well, that are better than what you could do, or that would take you much longer to get to. But you're not going to learn anything. You're going to learn in this superficial way that feels like learning, but it's completely empty calories. And secondly, if you can just vibe code it, you're not a programmer; then anyone could do it. Which may be wonderful. That's essentially what happened with the Access database. That's what happened with Excel. It gave accountants the capacity to become software developers, because the tools became so accessible to them that they could build a model for how the business was going to do next week. That required a programmer prior to Excel; afterwards it didn't, because they could do it themselves. Vibe coding enables non-programmers to explore their ideas in a way that I find absolutely wonderful, but it doesn't make you a programmer.

I agree with you, but I want to allow room for both of us to be wrong.
For example, vibe coding could actually be a skill that you train. And by vibe coding, let's include the step of correction, the iterative correction. It's possible, if you get really good at that, that you're outperforming the people who write from scratch, that you can come up with truly innovative things. Especially at this moment in history, when the LLMs are a little bit too dumb to create super novel things in a complete product, but they're starting to creep close to that. So if you're investing time now into becoming a really good vibe coder, maybe this is the right thing to do, if it's indeed a skill. We kind of meme about vibe coding as just sitting back; it's in the name. But if you treat it seriously, become a competitive vibe coder, and get good at riding the wave of AI, get good at the skill of editing code versus writing code from scratch, it's possible that you can actually get farther in the long term. Maybe editing is a fundamentally different task than writing from scratch, if you take it seriously as a skill that you develop. To me, that's an open question. I just think, personally, and you're on another level, but just personally, I'm not as good at editing code that I didn't write.

No one of this generation is.

But maybe that's a skill. Maybe if you get on the same page as the AI, because there's a consistency to the AI; it really is a pair programmer with a consistent style and structure and so on. Plus, with your own prompting, you can control the kind of code it writes. It could legitimately be a skill. That's the dream of the prompt engineer.

I think it's a complete pipe dream. I don't think editors exist who aren't good at writing. I've written a number of books. I've had a number of professional editors. Not all of them wrote their own great books, but all of them were great writers in some regard.
You cannot give someone pointers if you don't know how to do it yourself. It's very difficult for an editor to spot what's wrong with a program if they couldn't make the solution themselves. Editing, in my opinion, is the reward. The capacity to be a good editor is the reward you get from being a good doer. You have to be a doer first. Now, that's not the same as saying that vibe coding, prompt engineering, won't be able to produce fully formed, amazing systems, even shortly. I think that's entirely possible. But then there's no skill left, which maybe is the greatest payoff of all. Wasn't that the whole promise of AI anyway? That it was just all natural language? That even my clumsy way of formulating a question could result in a beautiful, succinct answer? That, to me, is actually a much more appealing vision than there being these special prompt-engineering wizards who know how to tickle the AI just right to produce what they want. The beauty of AI is to think that someone who doesn't know the first thing about how AI actually works is able to formulate their idea and their aspirations for what they want, and the AI can somehow take that messy clump of ideas and produce something that person wants. That's actually what programming has always been. There have very often been people who didn't know how to program, who wanted programs, who then hired programmers and gave them messy descriptions of what they wanted. And then when the programmers delivered that back, they said: oh no, actually that's not what I meant, I want something else. AI may be able to provide that cycle. If that happens to the fullest extent of it, yeah, there aren't going to be as many programmers around, right? But hopefully, presumably, someone will still, at least for the foreseeable future, have to understand whether what the AI is producing actually works or not.
As an interesting case study, maybe a thought experiment: if I wanted to vibe code Basecamp or HEY, some of the products you've built, what would be the bottlenecks? Where would I fail along the way?

What I've seen when I've been trying to do this, trying to use vibe coding to build something real, is that you actually fail really early. Vibe coding, at the present moment, is able to build a veneer of something that looks like it works, but it's flawed in all sorts of ways. There are the obvious ways, the meme ways: it's leaking all your API keys, it's storing your passwords in plain text. I think that's ultimately solvable; it's going to figure that out, or at least it's going to get better at that. But its capacity to get lost in its own labyrinth is very great right now. You let it code something, and then you want to change something, and it becomes a game of whack-a-mole real quick. Pieter Levels, who's been building this wonderful flight simulator, was talking about that: at a certain scale, the thing just keeps biting its own tail. You want to fix something and it breaks five other things. Which I think is actually uniquely human, because that's how most bad programmers are. At a certain level of complexity in the domain, they can't fix one thing without breaking three other things. So in that way, it's almost a positive signal that the AI is going to figure this out, because it's on an extremely human trajectory right now. The kind of mistakes it's making are the kind of mistakes that junior programmers make all the time.