- [upbeat electronic music] - Hey, welcome to "Ars Live," this is Benj Edwards, I'm a senior reporter... Senior AI reporter at Ars Technica, and today, we're diving into one of the biggest questions in tech, "Is the AI industry in a bubble, and if so, when will it pop?" And joining me is Ed Zitron, who I didn't even ask if that's how you pronounce your last name. - That... You nailed it. Don't worry. - Okay. - Fantastic. - Host of the "Better Offline" podcast and CEO of the media relations firm EZPR. Ed writes the newsletter, "Where's Your Ed At," where he's become one of the tech industry's most vocal critics, particularly when it comes to AI hype, and over the past few years, Ed has argued the generative AI market is, in his words, "A 50 billion dollar revenue industry masquerading as a one trillion dollar one," and today, we're gonna discuss the AI boom, whether it's sustainable and stuff, but, like, when I told people I was gonna have you on my show, Ed, everybody... This is not a show, but my broadcast, like, everybody's... I heard from so many people, 'cause you're a very controversial figure, some people are like, "Are you gonna grill him? Are you gonna, like...?" "Do you agree with him?" Some people love what you write, some people don't like it that much, but I, you know, personally, find your voice to be... I find you to be an important critical voice in the tech industry because there's so much, like, sort of, like, a mindless hype train that goes on because money, you know... Everybody wants the money to come down from the sky, and so, they just do the chant for the money and whatever it takes to get the money, you know? So there's... It's hard to speak out and stand separate from that crowd, especially when a lot of people's jobs are on the line for that, like if you work for a tech company or if you work even as a journalist sometimes. I have never had any censorship or... [no audio] Anything myself, but there's always... So I really appreciate your ability to speak out... [no audio] Openly and put... And you froze. - Yeah, you froze on your end, sadly. Yeah, I'm losing you. I don't... - My Internet connection... [indistinct] Anyways, so if I am... Am I unfrozen now, Ed? - Yes, you are. - Okay, good. Sorry about freezing, it happens sometimes. So I was hoping we could just have a kind of a conversation, like, as if we sat down over coffee or something, so it wouldn't be, like, a canned interview, but, like, these are the kind of things I would ask you if we just met for the first time, and, you know, one of my questions I would ask you is why are you so mad about AI? - Because everybody's acting like it's something it isn't, they're acting like it's this panacea that will be the future of software growth, the future of hardware growth, the future of compute, when you actually look at the raw numbers, putting aside the efficacy, which I'll get to, the numbers are bad, like, I... 
The 50 billion dollar number, it's around that, even if you include things like CoreWeave, Nebius, Lambda, all the AI compute companies or the hyperscalers, OpenAI, Anthropic, so on and so forth, it's a tiny industry compared to the amount of money going into it, on top of that, the models just do not have the efficacy, they don't do the things that people say they do, AI agents is one of the most egregious lies the tech industry has ever told bar none, like, autonomous agents don't exist, and OpenAI as an organization, terribly run, burning billions of dollars, but ChatGPT has been sold on mythology, all of this has, but ChatGPT in particular is just not that... It's not good software, most of this is not good software, you cannot consistently rely on it to do any one thing ever, yet you read the headlines and they talk about, "Wow, the power of AI, the powerful AI models," and the actual things they can do, they're kind of cool at best, but for the most part, just fairly mediocre and so expensive, like, if... I think if customers bore the costs of these, the actual costs, no one would even think about this for a second. - Yeah, that's a good point. One of the things I... The difficulties I have with your criticism is that I use AI chatbots every day now to help me brainstorm and stuff, you know? It doesn't write my articles or anything 'cause it's... That's not what we do, it's not our policy, and it's not allowed, and it would suck, 'cause it's not a good writer, but... So I use AI models as sort of knowledge translators and framework translators, like, to... And, like, a sort of a memory augmentation that... Ever since I had... I had COVID, like, so many times, and I've had some brain fog issues and... - I'm sorry. - ChatGPT is great for, like... If I can't put my finger on what this thing is called and I can't remember it, I ask, like... You can describe it in a, like, a roundabout fuzzy way and get an answer pretty quickly, and then, you can verify it, but, you know, you would never search... [no audio] For... Or if you didn't... Could agree with you that AI... These AI models are not what they're billed to be, you know? They are not people, they are not replacements for labor, they are... Like, potentially at best, yeah, some kind of augmentation tool... Sorry, somebody said I dropped out? - You did a little bit, but I can... I'm keeping up. - Okay, good. Yeah. So yeah, if you missed the question, I was talking about why you're so mad at AI, that's all, so... And anyway, so the thing is, you know, my big beef with the AI industry right now is the marketing that they are people replacements... [no audio] - Ooh, his Internet. [no audio] [indistinct] [no audio] - I'm so sorry, I've lost you, mate, so I don't know if it's just me talking to the crowd, hello, Ars Technica. [no audio] [upbeat electronic music] [upbeat electronic music continues] [upbeat electronic music continues] [upbeat electronic music continues] - [Lee] Ed, give me a holler when I'm up, or say that they're... You're bringing a better editor up to talk through stuff. [laughs] - No, it's a substitute quarterback situation. You're the Joe Flacco of Ars Technica. - [Lee] Oh, there we go, I think we're on the... I'm watching the stream here, I think we're at, "Please Stand By." - It's the "Simpsons" thing with the guy with the drink. - I know, right? I've got some backup questions, and I'm on wired Ethernet, so hopefully I won't drop out. There we go, I'm watching the stream here. [no audio] Okay. Am I up?
[no audio] - Oh, hey, I can see me on the stream, all right, so... [laughs] So hey, everybody, my name's Lee Hutchinson, I'm subbing in for Benj here real quick while Benj works his Internet issues out, so Ed, I've got some backup questions we can switch to, kind of the same topic, and it's stuff that I'm intensely curious about too, I was gonna kinda wait and hold these until the end, and then, feed them to Benj, but let's go now. I'm genuinely curious here, when, you know... If we are in a bubble, when the bubble pops, what breaks first, like, foundation model vendors? Does the GPU supply chain collapse? Like, what's the first thing that breaks? - So if I knew, I'd be far richer, I don't have any stocks, I don't buy stocks, I don't hold stocks, like, I'm an animal, but it's gonna be one of a few things, it could be this Oracle news that came out today that they lost 100 million dollars in three months with the installation of Blackwell GPUs, we don't know exactly what that means, and indeed, we may never know, but it definitely means that Blackwell GPUs, the new NVIDIA ones, are definitely worse margin-wise because they're extremely power hungry and expensive to run and all of the... All of that good stuff. It's more than likely gonna be a succession of bad news pieces, little pokes at the bubble themselves, it'll be an AI startup failing, I mean, Perplexity last year spent 164% of their revenue on AWS, Anthropic, and OpenAI, could be Anysphere, who makes Cursor, could be... I don't know, Cognition's always a good choice, they suck, horrible 996... What was it? 996 culture there, but regardless, it could be an AI startup, I think the foundation models are actually the last ones to go, but I think something with CoreWeave could break, the thing about this is AI is sold on myth, it's not sold on revenue because the revenues are complete crap, everyone's losing money, even Oracle is clearly losing money on selling GPU access, everyone's losing money, everyone hates doing this, everyone's miserable doing this, even the foundation model companies are kind of tired of it, so it's going to be a succession of bad news until just the market freaks out, now, the real things to look at are you've got Microsoft's earnings on October 27th, at any point, one of the hyperscalers could just say, "We're gonna reduce capex on AI," and be very definitive about that, that will freak out the market, it could be something around the AMD-OpenAI deal, which is one of the most stupid things I've heard in my goddamn life, made me so annoyed when I read about it that I couldn't get back to sleep, 7:00 in the morning, terrible sleep that day, but fundamentally, it's going to be... It's not gonna be, like, one Bear Stearns moment, it's gonna be a succession of events until the markets freak out, and that will be when there is a capital crunch on startups, that AI startups will just not be able to raise and they have to raise 'cause all of them have terrible margins, they're all... Like, all losing money, it's gonna be when the money starts running out or the doors close leading to the money, and the reason I can't give one compelling answer is we're in "Looney Tunes" land, the AMD deal is the stupidest goddamn thing I have seen, I think Pets.com was smarter, I think Webvan was smarter, I think the goddamn Metaverse was smarter than the AMD deal, I actually think Lisa Su was doing a clever one, though, 'cause 10%...
The first tranche of that 10% of stock is based on OpenAI building a gigawatt of data centers next year, it takes about two-and-a-half years per gigawatt and no ground has been broken, imagine if we had a financial and tech press that could say things, like, "Hey, where are those data centers getting built?" "No need, they'll work it out, the American frontier will simply absorb the space..." "Will absorb that space to build more data centers, which pop up overnight," so yeah, not a particularly satisfying answer, but those are the bits I'm looking out for. - I think we've got Benj back, but before I switch back to him, I really would like to ask one more thing, I had kinda my list of backup questions, and now that we're talking, I actually am intensely curious about this next one, what...? And I don't need an exact dollar amount here, so don't, like, feel compelled, like, I need a dollar amount, but, like, where does the monthly average revenue per user kind of break for a productivity Copilot-style assistant to be worth it to provide on the backend, for the backend providers to actually, you know, make it as a profitable product? Like, does this $30-a-month price point that we're being hit with now, is that, like, even remotely close to, like, a real price point? - No, so the problem is... And there's a company called Augment Code, I'm not sure if you heard of them, that's a great example, the problem is AI and... Basically, token-based models, AI models do not map to monthly subscription costs, which is crazy, but there is this company called Augment Code, I believe it's... [indistinct] So a coding environment, and they had one guy who spent... Who was costing them $15,000 a month. There is an Anthropic Claude Code customer, their apex capitalist predator is one Chinese guy who spent $50,000 a month, Viberank is a list of all the Claude users and how much they burn, there's people doing ten, five grand a month on there, the problem is that you can't actually do cost control on models, you just can't do it. Sora, you can sit there and you can... It will tell you nine times out of 10, "No, this breaks our copyright..." Sorry, not copyright, "This breaks our protection mechanisms," then it will run one prompt, it will accept one prompt, the same one, you just brute-force it, these models cannot... And it's not that they're so powerful they can't be controlled, due to the nature of how one interfaces with a large language model, you can't actually, with the system prompt, stop it from doing everything, it's one of the reasons they're so resistant to doing any kind of regulation around these models and kids because you can't actually fully restrict them without just doing onerous levels of restriction, so this is the thing, this is actually what makes AI kind of unique and insane, you can't tell how much a user will cost you. One asshole could say, "Oh, yeah, I'm gonna just mess around with that," it'll cost you two dollars a month on a 20-buck subscription, fine, and then, there'll be one other asshole that costs you 50, 100, 150, 200 bucks, OpenAI's losing money on their $200-a-month subscription, I mean, it's very bad and it's not getting better 'cause the cost of inference is going up. - Yeah. I'm back now, I believe. - What's up? Sorry, my Internet decided to die right in the middle of this and it hasn't... It's been fine all week, so anyway, the tech industry is upset about this, that you're trying to pull the plug. - They're trying to keep us down.
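[Editor's note: a rough, illustrative sketch of the unit-economics point Zitron makes above, where a flat subscription price meets heavy-tailed, token-based inference costs. Every per-user figure below is a made-up assumption for illustration; only the $15,000-a-month outlier echoes the Augment Code anecdote he cites, and none of this reflects any company's actual numbers.]

```python
# Illustrative only: flat-rate subscription revenue vs. heavy-tailed inference costs.
# All figures are assumptions invented for this sketch, except the $15,000/month
# outlier, which echoes the Augment Code anecdote from the conversation above.

SUBSCRIPTION_PRICE = 20.00  # assumed flat monthly plan, in dollars

# Hypothetical monthly inference cost per user for a cohort of 1,000 subscribers:
# most users cost almost nothing to serve, a thin tail costs far more than they pay.
monthly_inference_cost = (
    [0.50] * 850      # light users
    + [5.00] * 120    # moderate users
    + [100.00] * 28   # heavy users
    + [15000.00] * 2  # extreme outliers, like the $15k/month coding customer
)

users = len(monthly_inference_cost)
revenue = SUBSCRIPTION_PRICE * users
cost = sum(monthly_inference_cost)

print(f"users:   {users}")
print(f"revenue: ${revenue:,.2f}")
print(f"cost:    ${cost:,.2f}")
print(f"margin:  ${revenue - cost:,.2f}")

# With these assumptions, roughly 97% of users are individually profitable,
# yet the cohort as a whole loses money -- and the provider can't cap the tail
# without throttling the very usage it is selling.
```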
- Man, anyway, that's a joke, so, like, I was gonna ask you just briefly, do you use AI for anything, Ed, is it useful to you in any way? - I've played around with it, just to be clear, people, I've had silly people suggest that I don't talk to people in the industry and don't use it, I've used it, I've tried. I'm unimpressed. I don't... If I need to brainstorm, I'll think about it or talk to a friend, I... Actually, your example around brain fog, I'm sorry you're going through that, I have friends who have had long COVID, like, I understand how that might be helpful, that's cool, I'm glad that that helps you in that way, it's not a trillion-dollar use case but it's still... It's a use, I personally... Like, I can't code but I've messed around with Claude Code a bit and talked numerous times, many times over to people like Nik Suresh, who's a great guy who wrote, "I Will Piledrive You If You Write..." "If You Mention AI Again..." [indistinct] Carl Brown, like, I talk with people who really know this stuff so that I have a fundamental understanding of what's going on, I mean, I have tried things like, "Hey, make a PowerPoint presentation," just to see, and it never does it right, it never does it... The one thing it's meant to do is generate stuff and it doesn't generate, Sora 2, I've played with, what an ugly piece of crap. What a horrible product, horrible interface, inconsistent product, based on Azure pricing, it's five bucks a generation, and that's what the old Sora model costs, and they're 100% losing more than that, and that's the thing, like, I've tried because, I mean, whatever it would be... If I liked it somehow, it would be actually a more interesting story because I'd be talking about something I liked that was also onerously expensive 'cause I criticize it on that level, but it doesn't even do that, and it's actually one of my core frustrations, it's, like, this massive overpromise thing, and I'm an early-adopter guy, I will buy early crap all the time, I bought an Apple Vision Pro, like, what more do you say there? I'm ready to accept issues, but AI is all issues, it's all filler, no killer, it's very strange. - Yeah, I agree that there's a giant disconnect between the promise and the delivery of the product of generative AI models right now, what I think is probably gonna happen is that I do think the cost of inference will go down, or at least the cost of hardware will go down eventually, and they'll become commodities, like, that will generate stuff, and, you know, then, I mean, there is no trillion-dollar use case at the current cost, but I believe that at the... A cheap cost, when AI is integrated into operating systems and runs on everybody's PC locally, it'll be just a thing that people use. - Well, man, I appreciate that point, but first of all, the costs are going up, like, unilaterally across the board, even with ASICs, Cerebras is able to do things faster, Groq is able to do things faster but not cheaper, no one is, but on a fundamental level, let's say it was profitable. Let's say the thing today was profitable, do you really think integrating it on an operating system level would be that useful? 'Cause it can't do distinct commands, large language models are terrible at definitive things, like, they... Like, deterministic rules with these things are really rough, I'm... I would buy the idea that this will eventually run local, I 100% believe that in the future at some point, they will have very expensive coding LLM machines, you've already...
What is it, the DGX Spark or whatever...? That NVIDIA one, the $3,000 machine, you'll have versions like that, I think that that's realistic, but cost isn't coming down, it's going up, which is crazy as well 'cause it's meant to go down over time, that's how... That's generally how the tech industry is meant to work, right? - I think it will eventually work that way, we're... I think we're in a... Like, we're in a curve of, like, this massive demand for something that is not being met, and they are throwing a lot of GPUs at it and it's not getting efficient fast enough to meet that, you know, like, I think... I'm talking years here, it'll be cheaper, right now, it is... - It's also not getting more efficient. - Well, I think that, you know, chips always get more efficient, it's, like, the history of technology, there was a time when, you know, a computer, they... The Air Force built these, like, four-story buildings in the '50s that had a computer that would do 75,000 operations per second and used two megawatts of power, and now, we've got millions of times more power in a tiny phone in our pocket, and I think that will tend to continue over time. Now, whether the AI models will be useful, you know... [Ed stuttering] Did I freeze again? - I... You're frozen, but I can hear you when you're talking, that point refers to something... Like, that doesn't mean that it's getting more efficient, like, the whole thing is that this is not necessarily just a hardware problem, it's also a software problem, but it's also the fundamental underlying technology of these models, inference, is not cost-controllable, and he's gone. Bugger, that was a good point. Sorry, everyone. My points are too angry and I'm making him disconnect. [no audio] Are we still on? - Okay, stand by, let me take back over from Benj real quick, he's continuing to have connection difficulties, apologies, everybody. [Lee clearing throat] Ed, I'd like to... Since I don't have Benj's questions in front of me but I do have, like, an awesome standby list here, I wanted to ask a little bit about your argument in "There Is No AI Revolution" about this all being a cynical con, I mean, cynicism is a hard word, right? I mean, is there cynicism in the entire stance that the industry has taken right now? 'Cause, you know, we see this all couched in such, like, positive terms. - It is cynical because they're lying, like, because Sam Altman and Dario Amodei are going out there just being like, "This will replace all coders, this is gonna be the most powerful assistant of all time, this will be, this will be, this will be," as they know, as they look at the models themselves, as they have the totality of knowledge about them and they know, because you know this if you look at them remotely, that they cannot do that, that the whole idea of an assistant is consistency, we don't hire an assistant so that they can diddle around on Google Calendar, we hire them because we know they're gonna get it right, they... We know they're gonna understand our worlds, but also, our nuances, the context of who we are, where we are, what we're doing, an assistant cannot do that based on data, and also, a large language model is bad at deterministic functions, you can't rely on it. It is cynical to do this, and it's especially cynical for Sam Altman to be promising Oracle 300 billion dollars, to promise NVIDIA they'll build 10 gigawatts of data centers. Abilene, Texas does not have the power to power Stargate.
The power is not there, yet they are flying reporters out to Abilene, Texas, middle of... I was about to say a word I can't say on the line. - [Lee] [laughs] Thank you. Appreciate it! - Self-censored there, but they're flying people out there to go, "Wow, look at this dirt," when the power isn't there. It is cynical, it's cynical because they are willfully misleading people, this isn't people going around saying, "Yeah, it's expensive but it's experimental cloud software, we need these GPUs, they're really expensive so maybe we don't get a lot of them, we need to focus ourselves," no, it's worse than that, it's like watching a bunch of rich kids swan around claiming they're living their lives as they pop from five-star hotel to five-star hotel, setting fire to the furniture every time they stop and telling us that this is the future of heat. It's a disgrace and it is cynical because I'm sure there are tons of LLM people out there who are like, "Ah, I find this an interesting tech," like, Benj himself was even like... He has no illusions about what this can do, he's very clear about the use cases, that is not how this is being sold, that's not how this is being raised, that's not how the financial media is framing it, and it's certainly not leading anyone to do the depth of financial analysis they need to do on a day-to-day basis at many of these large outlets, and it's frustrating because in the end, the people that will get hurt are retail investors, regular people and people's 401(k)s, people's retirements are gonna get savaged by this, right now, they're doing great, and because they're doing great now, everyone's saying, "Nothing bad ever happens to us, nothing bad ever happens." - Let me change tack a little bit, because I appreciate what you're saying and, you know, I don't think you're wrong, and apologies to the chat, I'm hearing that my mic is a little loud, I didn't have time to pre-tune, I didn't think I was gonna be on here, so if I'm loud, I'm so sorry, but I wanted to ask sort of... This is more of maybe a more positive question, is there anyone out there who... Any of these AI companies so far who have nailed sustainable paid data licensing and done it in a way that's responsible and that compensates creators appropriately? I mean, is that even possible under the current model? - I don't believe so, I believe there is... I don't even remember the name, there is one profitable AI company I can think of and that is a gadget company that I can't remember the name of that was maybe in Bloomberg or the Times, and they don't actually use models much, they feed user data into a model and summarize... It's not really an AI company at that point, it's just a hardware company with cloud software, the whole licensing thing is, no one wants to be nice about that because the moment you start being nice, you give people precedent to start suing other people, so someone should really do it, but it's already too late to do this, there is no avoiding, I believe at this point, the theft of the Common Crawl, which is the base that everyone seems to train on, and the moment they start having to hand out money for this, as Anthropic is seeing, that's when it gets really expensive, the answer would've been, by the way, to not do this, but we've already done it. - You know, this is really interesting, I don't wanna take time up here and I see Benj is back, so I'll drop off here in just a second, but my first job out of college was at...
Back in the 2000s was at a startup and they've since folded, the startup was called Questia, and their product was they were gonna take... They were gonna have an academic research library available online with built-in, like, bibliography-building tools where you could, you know, highlight passages and it would build your works-cited list for you, and they were gonna sell, essentially, a subscription service to university students so they could, you know, help them write research papers, to acquire that library, to actually get the books in, they had, like, a 200-person copyright research department and that's all they did, they researched copyrights on books, they contacted authors, they arranged licensing, this is... The company tried to do it the right way, to actually approach individual rights-holders of orphaned work... Or of older works, in some cases, orphaned works, try to track down who had them and to negotiate for the usage rights, and then, to acquire a physical copy of the book and to digitize it, like, on-site with that physical copy and doing it that way, while that was the right thing to do and it properly compensated everybody, it was expensive as hell and it was massively time-consuming, and not just time-consuming, I know, right? Like, it took, like, a huge amount of physical resources to do. As someone who is paid, you know, from the sale of copyrighted works, you know, I work at Ars Technica, I write stuff for a living, I see both sides of this, like, you know, you want the tools, you want the neat stuff, but, like, working through the system is ludicrously complicated. - Yeah, and I want seven billion dollars, but if I start robbing banks, they're gonna arrest me. - Me too, right? - Like, if I want to... I would love to... I would love all sorts of things, there's all sorts of crap that I want, but it's like, yeah, it's expensive, that's because you're taking everyone's stuff. Like, it isn't particularly nuanced here, they have no interest in doing that and they're seeing how far they can get, and everyone has kind of agreed to do the same crappy thing. - Yeah, it's a problem. I'm back, sorry, this has been not my day, but anyway, so I missed what you're talking about but we can pick up with maybe a question about the bubble since we... Like, I could talk to you all day about all kinds of other things, but, like, to you, let's define the scope of the bubble since we don't have a ton of time, and by the way, we're gonna take questions at the very end, I was supposed to tell you that earlier and I apologize, but... So what is the AI bubble to you? Can you just help me define it, what we are seeing going on? - So there used to only be one, but now, it's two.
Before the neocloud era, the CoreWeaves, the Nebiuses of the world, where they were just selling AI compute, the big bubble was the foundation model companies, the OpenAIs of the world, and indeed, the resulting startups, now that NVIDIA must continue growing forever, it is now maybe two or three, it is the AI startups, so the OpenAIs and the connected bits, and then, the contagion of AI compute, the obsession, and the unhealthy obsession, I should add, of the markets with NVIDIA and any cloud-related stock which even smells of AI, and it wasn't as bad, but now that the market has become so addicted to NVIDIA, and the numbers are so high and so stupid, now, it is a massive bubble of overpromises, people trusting Sam Altman, a mistake that too many people are making and everyone will regret one day, I'm sure of it, now that this bubble... Now, this bubble has escaped fully into the private market... Public markets, 'cause before, it was private startups, which was bad and would've resulted in venture capitalists losing money, which has happened before, though I must add, this is nothing like previous bubbles, there are comparison points, but it's not the same thing, history is new, but now that it has got to the point where you see AMD and Oracle getting boosts off of OpenAI saying, "I will give you a bazillion dollars in two weeks," and everyone goes, "Wow, arf, arf, arf, this is amazing," that's the real thing, that's where the risk has escaped, but really, the third one is also private credit. There's apparently 1.2 trillion dollars' worth of debt built up in AI now, this is not good considering how little money there is and how much money's being burned, and they... I have compelling evidence I've published there that AI compute companies are not making money, that the demand is not there for AI services, I am correct, and as a result of me being correct, we're in real trouble once the market accepts this 'cause all of those neoclouds, CoreWeave for example, is mostly hyperscaler or OpenAI revenue, tell you... Just a simple question, what happens when OpenAI is meant to pay these people? Are they gonna pay them in funny money? Are they going to pay them in Monopoly bucks? No, they need real money, except they burned, I think, 9.7 billion dollars in the first half of this year, it's all very bad, and when you say it out loud, it doesn't feel better. - Yeah, it's... I... The way I see it is, like, I feel like the companies... The tech companies are in a unique position right now in history that's... Reminds me of the gilded age of the 1880s, 1890s when the industrialists had massive monopolies 'cause there was no regulation yet, and they figured out how to turn industrial machines into these massive money-making businesses and these people got so extremely powerful, and I think that what we have today probably dwarfs that in terms of capital available, so I feel like, you know, they're gonna burn through their capital first, like, these companies are subsidizing AI, like Amazon, Google, Meta, Microsoft, because they have side businesses, and so, what...? Who hurts...? Like, who gets hurt the worst when the bubble pops? It feels like the big companies could survive, but, like, do you think OpenAI will get acquired by Microsoft, or do you think they'll disappear, or what do you think's gonna happen? And, like, Anthropic for example too, what happens to them? What do you think? - So OpenAI is an interesting one because I think that... 
Matt Hughes, my editor, has this theory that they'll become a PO box one day, just, like, a patent troll, just, like, a series of... A company that just sues other companies that has, like, one guy and a few lawyers, I think OpenAI, if they can't convert to a for-profit by, call it February next year, done-zo, I think they just start running out of money and why would...? If OpenAI cannot convert to a for-profit and make the plans to go public, they are functionally a Ponzi scheme, and I don't like saying that 'cause people misuse that term, but if this company cannot go public, there are no liquidity events other than other money entering the system so other money can exit. Now, if they convert, at some point, OpenAI is gonna have a cash call, at some point, they're gonna have to give Oracle some of that 300 billion dollars, they're gonna have to build that data center, they're apparently raising tens of billions of dollars, tens of billions of dollars is not gonna make a dent in how much they owe, like, they... - So what...? Like, what do you think...? Like, what's the plan...? What is Sam Altman's plan here? He's gotta have some plan of how this ends, the endgame is what, a super-god, super-intelligence that makes infinite money or what? What do you think? - I actually don't think that there is a plan at all, I don't think there's ever been one, I think they throw everything at the wall, I think they're used to everyone saying, "Yes," to them, the reason I used the rich kid example is everything's been given to Sam Altman. The press have done it too, they've accepted every line he's given, the financiers have done the same thing, the big companies are all run by business idiots who don't know what they're talking about, so they're like, "Whoa, everyone's talking about Sammy," "Clammy Sammy" has got one on everyone, everyone's buying his crap, and here's the thing, all of those promises will come back to hurt him because he's now promising real companies real money, it's no longer him just saying it will become AGI in two minutes, it's him saying, "NVIDIA, I'll build you 10 gigawatts of data centers," it's, "AMD, I'll build you six gigawatts of data centers, Oracle, I'll pay you 300 billion dollars across five years," CoreWeave, what, is 22-point-something billion dollars, at some point, Mr. Altman's gonna have to pony up and Mr. Altman don't got the cash, no one's got the cash, so Altman is an operator, he is a guy who's good at getting yeses, he has no vision, look at anything OpenAI has done in the last few months, even the last three weeks, everyone's freaking like, "Oh, OpenAI, they did apps, they're bringing apps to ChatGPT, oh, I have a short-term memory issue," because they announced that in 2023, some of the same partners too. There was already an apps SDK in ChatGPT, they just brought it back and it was like, "Ooh, wow, new thingy!" Sora, one of the most embarrassing launches of all time, and at five dollars a video it's costing them or more based on my analysis, and on top of that, it's an ugly, stupid app best known for copyright infringement, a marketing tool framed as technology. All of this stuff is to say there's no strategy, what is OpenAI? And Sarah Friar, their CFO, has signed off on projections saying that they'll make more money than Meta, 200 billion dollars in 2030, and they'll be... They'll have positive gross profit margins as well, how will they do it? 
Bugger all knowledge, there's no actual thing, this is a company flying by the seat of its pants, taking advantage of a media that's asleep at the wheel and investors that don't know anything, and the frustration I have is that it's blatantly obvious, and when all of this falls apart, people are gonna write these articles saying, "How could we possibly know?" And I'm having all of their heads, every single person who writes an article like that, KOS, I'm serious, I'm... Because guess what? The reason, the way we would've known that this was gonna happen would've been bloody looking. - Reading your newsletter, maybe? [laughs] - Yeah, or I don't know, looking at the numbers. OpenAI was... They... We knew they'd lose five billion dollars in June of last year, it was... Mira Murati, I believe, at the Information, like, the Information's been really good on publishing this, but the facts have been out there for some time, I've been building an argument over time, but a lot of the things I reference are from early this year or middle of last year, no one wanted... - So, is there any path... Well, let's say... I mean, let's say Sam Altman is right that AGI comes and replaces people, workers or whatever and they get to deploy labor on a machine, I mean, does that make all the money they need to survive, even if that happens, do you think? Fantasy scenario. - If a frog had wings, it could fly, like, first of all, Sam Altman himself said that AGI was not a useful term about a month ago, but what is AGI at this point other than slavery? Because that's what people are describing with AGI, a conscious being that we trap in the computer to do our bidding, that's slaves, but putting all that aside, what does AGI mean? Is it an autonomous computer? - Well, it's a nebulous marketing term, it's very hard to define, I just wrote an article about that too a couple months ago, they can't define it and, you know, I think it's probably time to move beyond the AGI thing, but I think it's part of Sam Altman's, like, whole selling package so it's hard to do. - And Dario. - Yeah, that too. So let's see, what happens when it pops? Let's also... Like, what about all these big data centers they're building? What do you think is gonna happen to them? Are they gonna be a waste? Will they be put to use? What do you think? - Waste. GPUs do not have a ton of use cases. You've got data analytics, you've got... Crap, you've got, like, science stuff, you've got video, you've... But nothing that's, like, hyperscaler... When I say, "Hyperscaler," I mean, like, big growth, like, meaningful growth at the hyperscalers, like, there's nothing like that, I think a number of them never get built, that's my new call because I've been looking up Abilene, Texas, where the Stargate is, there's not enough power. They need a... Over a gigawatt of power, they've got a 350-megawatt power station and a 200-megawatt substation, they need about double that, no one seems to be able to answer those questions, they don't respond to my emails. The point is this is no... These data centers, and I think people... And I say this myself, I've only recently learned, people have a very poor understanding of data centers in general, these things take years to build, but even if you can build the data center, get them all of the water and the power, the power is the problem. A gigawatt of power is a lot and it's not... You don't just build a power station, you don't just...
It's not like "Red Alert 2" or "3," two fantastic games, nor are there psychic Russians, but putting that aside, there... You don't erect a power station and it just happens, there are months of actual... Even if you put all regulation aside, you have actual... Like, you actually have to do physics to make sure that it doesn't kill everyone when you... I was just talking to... [indistinct] Actually, about this, it's like when you build a gas pipeline, you have to, like, destroy every community between the power station and the eventual out point, but even then, fundamentally, the power doesn't exist for most of these data centers, it may "exist" in that they can permit it, but building power takes time and it's not something you can rush, you can rush construction, you can throw a bunch of money at it, and then, it also costs a bunch of money, so I don't think... I think tons of these data centers just don't get built fully, I think a lot of people take a bath on them, I think you've got Meta doing Enron crap, you've got Meta doing an SPV, which people invest in, which they pay money into, every bad sign is there, but data centers themselves, they're... Many of them are not getting built. Sorry, I realize we're going slightly over the question. - That's all right, I... That's fascinating because I... You know, I pointed out that 10 gigawatts is 10 nuclear power plants they have to build, so, like, in one of my articles, that's, like... That's not a trivial thing to build one nuclear power plant. - It's not just building them as well, it's actually getting the power to the... Like, it's insane and everyone's just casually throwing around this term, gigawatt here, gigawatt there, the power station in Manhattan is 600 megawatts. I think the one in Queens is, like, 900 a gigawatt, that's for an entire city and it took a lot of money and time to build, it's just not gonna happen. - Yeah, it does... It's... You're painting a compelling case here, I'd say, I'm supposed to move on to audience questions now, we could talk all day, I've got so many more questions we couldn't get to, but I'll just have you back another time on my show next year, I don't know, so Tim Torres asks, "What should the average person be doing to insulate themselves against the impacts of a potential bubble popping? Does that change for white-collar versus blue-collar?" - So the weird thing about this is because there's not that much money in AI, the economic impact isn't going to be... It's not as bad as the great financial crisis because you've not got millions of people with a bunch of houses that they can't pay for, you've not... I don't think you've got banks running out of money here, that being said, the stock market is going to be quite apocalyptic, I am not a financial advisor, I cannot give you financial advice and I will not, I won't tell you what to short, I won't tell you what to buy. I will just give you the information for you to make decisions. That being said, there's no real money here anymore, like, the amount of money being promised by OpenAI is egregious and ridiculous, and if they've refused to pay or can't pay, that will cause trouble with those stocks, I can't say to what end, though, I can't... Like, I can't, but know that if your eyes and your ears are telling you this stuff isn't revolutionary, if your eyes and your ears are saying your bosses, who seem quite stupid, seem to think this is the future, maybe they're just stupid, maybe it's not you, maybe everyone... 
Everything you're reading about the power of AI is just marketing, 'cause that's really what it is. - Yeah, maybe Benj Edwards is also telling you these things if you read his articles... - He is, there's a reason... [indistinct] - So that's a good answer, so we're not worried about, like, the whole economy melting down necessarily if this pops, is what you're saying? - The thing is my larger thing I'm worried about is the amount of private credit going into this, the amount of debt here is astonishing and I am... What continues to worry me is the amount of stock pops we see from OpenAI touching stuff, in a sane, rational, moderately realistic market, we wouldn't trust a company like OpenAI, who will probably lose about 10, 20 billion dollars, you know, give or take a couple billion, talking real money there, this year, you would say, "Hey, wait a second, they can't afford this," now, this is usually a line in articles, this is usually one line saying, "Yeah, it's not clear how they'll afford this." Buddy, like, the financial and tech press, they need to nut up here and just say, "Hey, they cannot afford this," not, "They don't..." "We don't know how," not, "We're not sure," they cannot, because there is a limit to available debt, there's a limit to available venture capital, I think there's a guy called Jon Sakoda who did analysis that said we'll only have about 164 billion dollars of U.S. venture capital left by the end of the year, OpenAI is, what, meant to take all of that every year? God, no, it's not gonna happen, even if you had some obscene government program, which is not going to happen by the way, they're not going to give them more than 20 billion dollars a year, which won't even pay the flipping power bill, but this is also assuming that OpenAI actually has the users to keep up with that compute 'cause what, they hit 800 million "weekly active users," but even in their own research about how people use ChatGPT, they say they don't deduplicate them, when someone's logged out, they could log... Use another machine to play ChatGPT and... "Play ChatGPT," I sound 100 years old, but to use ChatGPT and they would count that as another user, OpenAI is playing some silly buggers with numbers and I think you're gonna... One day, it's gonna come out how silly those buggers really were, but I think that it's just in this... - Let me interrupt you for a second, just as a continuation, one of the things you said happens to tie into one of these questions, which is "JesseTG" asks, "Do you think anyone is likely to get a government bailout when the bubble pops?" - So this isn't something you can bail out, so to illustrate what the bubble really is, it's NVIDIA. It is NVIDIA, the reason that Oracle agreed to a $300-billion contract the customer can't pay for is 'cause of NVIDIA's stock growth, NVIDIA has been doing insanely well. The reason that the market is scared, the reason the market panicked in January, the reason that the market kinda got wobbly when they put out their earnings is NVIDIA had a piss-poor 55% year-over-year growth rate, which for any other stock, you'd be like, "Hell, yeah!" But for NVIDIA, which had quarters where they're, like, 144%, 256% year-over-year growth, NVIDIA became the largest stock on the U.S. stock market, 7%-to-8% of the value of the S&P 500, everything's around NVIDIA.
That's the thing, that's all it really is, it's associated stocks but it's mythology, Microsoft, they stopped announcing how much money they're making in AI, point is, none of the companies that are going to cause the bubble to burst can be bailed out. Even if you bailed out OpenAI, unless they were public, which they are not and they will not be able to do if they can't convert even to a for-profit, they are a private company, thus they would not usually affect the market, NVIDIA, their growth is why the bubble is inflated, if their growth goes down, the bubble will burst, here's the problem, you can't really bail that out, are we meant to nationalize NVIDIA? Are we...? Because just to be clear, it isn't about NVIDIA continuing to profit, it's about NVIDIA continuing to make more profit and more profit growth every single quarter, 55% wasn't enough, but if 55% was what they needed, they need to, in a year, be making 72 billion dollars, and the year after, that'd be over 100 billion dollars, in a quarter, in a quarter! Selling GPUs that lose you money the moment you plug them in, Oracle lost 100 million dollars in three months because of Blackwell, like, it's... I've been saying this forever and I get so animated about it 'cause it's just... It's there, you can see it today. - Yeah, I understand that this... Do you think that's why NVIDIA's doing these circular investment things, like, you know, to keep propping this stuff up, like, "We need...?" - I mean, that's all CoreWeave is, CoreWeave, they literally... They funded CoreWeave, became their biggest customer, then CoreWeave took that contract and those GPUs, and used them as collateral to raise debt to buy more GPUs, that's why NVIDIA did it, it's kind of brilliant, it shouldn't be legal, but it's kind of brilliant. It will fall apart because CoreWeave has 25 billion dollars' worth of debt and their largest customer, OpenAI... Well, they were meant to start paying them October 1st, I do hope that check's clear. Like, it's just, like... It really is, like, when will Wile E. Coyote look down? [Benj laughing] - I am... You know, gosh, I wish I could call that myself, like, who... Someone... Ronan McHugh said, "Who has an interest in actually calling OpenAI's bluff? All the big players want to keep the hype wheel spinning," and yet, at some point, the rubber has to meet the road, right? And what's...? When do you think that will happen? You're not a fortune-teller, but are... Like, are we talking a year, two years, six months? What do you think? - Next year-and-a-half. 
Like, I think next year-and-a-half but it could be quicker if OpenAI doesn't convert, if they don't convert, it's a real pants-down moment for them, there's no way out, this whole nonprofit thing was a horrible idea, but I think the important thing to know here, and to kind of answer one part of the question, is the reason that no one's calling the bluff is we're currently in something I've called the Rot-Com bubble, which is tech is out of hyper-growth ideas, we don't have a smartphone, we don't have a cloud compute, we don't have a new thing that will grow everything forever, generative AI is meant to be the thing that grows everything forever, it's meant to be the panacea, it's meant to be the new smartphone, the new cloud compute thing that they can sell to everyone, it's meant to be the new hardware thing, it's meant to be the new consumer subscription, it's meant to be everything, and it just happens to be the most expensive thing ever that loses tons of money, you can't control the costs, so because it's mythological, 'cause everything is around basically hope and hype, it could be a year-and-a-half, it could be three months, it really depends if a bunch of really crappy news comes out, and I think there will be an AI startup that runs out of money, which will cause all the other AI startups to start running to their venture capitalists saying, "Hey, you know that running out of money thing? We're also doing that," and then, venture capitalists will start saying, "Crap, is everyone doing this? We need to sell our companies," and then, everyone will try and sell their companies at once, a fire-sale environment will be created, at which point everyone's valuations will dip, at which point it'll become basically impossible to raise venture capital for an AI company, at which point this will hit the foundation-model companies, now, the reason I say that is the foundation-model companies, OpenAI and Anthropic, make a bunch of money from startups, they make a bunch of API revenue, Anthropic is the most exposed, though, because most of their money comes from selling API access, once that flow of money comes down, it's bad news for them, their second-largest customer, Anysphere, Cursor, Tom Dotan over at Newcomer, he reported that they're sending 100% of their revenue just to Anthropic for compute. All of this is just castles on sand and no one wants to admit that that's the case, but the moment one of them does it, the moment they get the excuse, notice that everyone started saying, "Bubble," after Clammy Sammy said, "AI bubble," everyone's willing to say that now, soon, someone's gonna say, "This is unprofitable," they're gonna use the magic words, and then, everyone's gonna go, "You know what? It's unprofitable, but, you know..." And it's kind of gone from this point when people say, "This is the most transformative technology," to, "Of course, it's a bubble, you moron! Duh! Yeah, we're losing a bunch of money, whatever, it's good, though." - I think... You know, I feel... I have this sense that we've sort of hit a peak of this particular cycle here with AI that, you know, even... When even people like Benj Edwards, me, who are...
You know, people accuse me of being an AI hypester and all this stuff, and I'm interested in the technology, and I think it's really cool and you can do neat things with it, but I have to tell people it's not what you think it is and it's not gonna do what you think it can do, and the fact that I'm comfortable saying that out loud means I think we're in the sort of environment where people are ready to hear that and accept it, people like Sam Altman are accepting there's a bubble, I heard Jeff Bezos say there's a bubble... - But they're saying it in this way that's like, "Oh, it's a bubble for other people, but not me," which is the classic scam artist thing, that's the number-one, like, conman, cult leader thing, it's like, "Yeah, it's a problem, but not for me, I've got a system." - Yeah, I think they also think that there's gonna be a bubble, a burst, and then, some sort of fallout... Like, some kind of long-tail benefits that, you know, continue into the future that aren't of the same scale or magnitude possibly. - Well, I think there's a depression coming, I think once the markets work out that tech doesn't grow forever, they're gonna flush the toilet aggressively on Silicon Valley, it's a bad scene and it didn't have to happen. - Is there anything that would falsify your premise of this bubble and crash happening? Like, is there anything that could actually happen that would keep the bubble from popping? Like, what...? Like, you know... Like, what would the technology need to be to...? Like, I kind of asked you this a little bit earlier, but... Yeah, so what do you think? Like, what if you're wrong? Like, you know... I know you don't think you're wrong, but other people do, so why do they think they're right, and you don't...? [indistinct] - I've been answering, "What if you're wrong?" For a year-and-a-half to two years, so I'm not bothered by that question, so the thing that would have to happen to prove me wrong would've already needed to happen. It wasn't... The point of no return was the Oracle deal. By signing that deal, they put themselves in a position where they cannot win, that deal was an insane move built to buoy both OpenAI and Oracle, it was... Honestly, I don't know how it is legal to sign an agreement, I assume it is because they filed things with the SEC, fine, but I don't know how that's legal, it's insane, the reason I say that is for this to have gone right, for it to right itself, they would've had to find a way to make, honestly, just inference profitable. Now, you may say, "Well, Sam Altman has said we're almost profitable in inference, we're almost profitable in API," Sam Altman said a lot of crap.
Sam Altman yaps himself silly like an old woman at a hairdresser, he just says whatever he feels like, the thing that would've had to happen with inference would've had to be, like, minute, it would have to be like hundredths of a cent per million tokens, like, they would have to be printing money, and then, it would have to be way more useful, like, it would have to have efficacy that it does not have, the hallucination problems, which are unfixable per OpenAI and have always seemed unfixable, would have to be fixable, and on top of this, someone would have to fix agents, the problem isn't just that the technology sucks, it's that the technology sucks and everyone has been saying, "This is the biggest, most important, best thing ever and it will do literally everything, it will make your stock go up, it will make your software better, it'll be the best consumer device ever, and also will pay Oracle 300 billion goddamn dollars in five years." Unbelievable, like, I... When I think about this stuff, I get so frustrated because it's nice now, it's nice because people are listening to me, but a year ago, people were like, "You're insane, Ed, five billion dollars, whatever, Uber lost that much money," Uber didn't lose even a fucking percentile, pardon my French, of the amount of money that OpenAI has cost in the last two years, but no one wants to go and do the historical reading... [indistinct] - Yeah, may I interrupt you for a second? What changed? Like, you know, I have seen your trajectory since the beginning of the AI stuff, and I really appreciate your... The "Rot Economy" essay was really good and the... You know, some of your... I thought some of your economic arguments were sort of, like, okay, maybe, like, a year ago, but then, OpenAI kept asking for more money, and more money, and more money, and more money, and these circular things started happening and it's... You know, it's starting to look a little weird, so unfortunately, I have to wrap it up, but I wanna conclude with something, you know, you don't like Sam Altman very much, but what's the best thing about Sam Altman? Can you say anything nice about him at all? [laughs] - Why do I need to? - You don't need to, it's a challenge. [indistinct] You can say no, there's nothing... - I understand why you're asking this, but I wanna be clear, Sam Altman is going to be the reason the markets take a crap, Sam Altman has lied to everyone, Sam Altman has been lying forever, Sam Altman misled... He misled his board, he misled Oracle, he misled the markets, he has led... Like the Pied Piper, he's led the markets into an abyss, and yes, people should have known better, but I hope at the end of this, Sam Altman is seen for what he is, which is a con artist and a very successful one that has proven how poorly we are overseeing our markets, parts of our media, and definitely large parts of government throughout the world, by the way, lots of people have been taken by this, how many business idiots control things, Sam Altman... You know what? I'll say something nice about him, he's really good at making people say, "Yes." - There you go. That's something. Anyway, thank you so much for your time, and I- - My pleasure, thank you for having me. - You've been awesome and I appreciate it, we'll talk again in the future, thank you. - Looking forward to it. - Bye-bye. [upbeat electronic music]