getnormality 3 hours ago

A decade ago, IBM was spending enormous amounts of money to tell me stuff like "cognitive finance is here" in big screen-hogging ads on nytimes.com. They were advertising Watson, vaporware which no one talks about today. Are they bitter that someone else has actually made the AI hype take off?

  • jimmar 2 hours ago

    I don't know that I'd trust IBM when they are pitching their own stuff. But if anybody has experience with the difficulty of making money off of cutting-edge technology, it's IBM. They were early to AI, early to cloud computing, etc. And yet they failed to capture market share and grow revenues sufficiently in those areas. Cool tech demos (like Watson on Jeopardy) mirror some AI demos today (6-second videos). Yeah, it's cool tech, but what's the product that people will actually pay money for?

    I attended a presentation in the early 2000s where an IBM executive was trying to explain to us how big software-as-a-service was going to be and how IBM was investing hundreds of millions into it. IBM was right, but it just wasn't IBM's software that people ended up buying.

    • stingraycharles 2 hours ago

      Xerox was also famously early with a lot of things but failed to create proper products out of it.

      Google falls somewhere in the middle. They have great R&D but just can’t make products. It took OpenAI to show them how to do it, and they managed to catch up fast.

      • SamvitJ an hour ago

        "They have great R&D but just can’t make products"

        Is this just something you repeat without thinking? It seems to be a popular sentiment here on Hacker News, but really makes no sense if you think about it.

        Products: Search, Gmail, Chrome, Android, Maps, Youtube, Workspace (Drive, Docs, Sheets, Calendar, Meet), Photos, Play Store, Chromebook, Pixel ... not to mention Cloud, Waymo, and Gemini ...

        So many widely adopted products. How many other companies can say the same?

        What am I missing?

        • smoe an hour ago

          I don't think Google is bad at building products. They definitely are excellent at scaling products.

          But I reckon part of the sentiment stems from many of the more famous Google products being acquisitions originally (Android, YouTube, Maps, Docs, Sheets, DeepMind) or originally built by individual contributors internally (Gmail).

          Then there were also several times when Google came out with multiple different products with similar names replacing each other. Like when they had I don't know how many variants of chat and meeting apps replacing each other in a short period of time. And now the same thing with all the different confusing Gemini offerings. Which leads to the impression that they don't know what they are doing product-wise.

          • cma 6 minutes ago

            Why wouldn't you count things initially made by individual contributors at Google?

        • aaronAgain 42 minutes ago

          Those are all free products, and some of them are pretty good. But free is the best business strategy to get a product to the top of the market. Are others better? Are you willing to spend money to find out? Clearly, most people are not interested. The fact that they can destroy the market for many different types of software by giving it away and still stay profitable is amazing. But that's all they are doing. If they started charging for everything there would be better competition and innovation. You could move a whole lot of okay-but-not-great cars, and top every market segment you want, if you gave them away for free. Only enthusiasts would remain to pay for slightly more interesting and specific features. Literally no business model can survive when its primary product is competing with good-enough free products.

        • 7thaccount 27 minutes ago

          They come up with tons and tons of products like Google Glass and Google+ and so on and immediately abandon them. It is easy to see that there is no real vision. They make money off AdSense and their cloud services. That's about it.

          • nunez 18 minutes ago

            Google does abandon a lot of stuff, but their core technologies usually make their way into other, more profitable things (collaborative editing from Wave into Docs; loads of stuff from Google+; tagging and categorizing in Photos from Picasa (I'm guessing); etc)

        • falcor84 21 minutes ago

          Notably all other than Gemini are from a decade or more ago. They used to know how to make products, but then they apparently took an arrow in the knee.

        • Esras an hour ago

          I think the sentiment is usually paired with discussion about those products as long-lasting, revenue-generating things. Many of those ended up feeding back into Search and Ads. As an exercise, out of the list you described, how many of those are meaningfully-revenue-generating, without ads?

          A phrasing I've heard is "Google regularly kills billion-dollar businesses because that doesn't move the needle compared to an extra 1% of revenue on ads."

          And, to be super pedantic about it, Android and YouTube were not products that Google built but acquired.

          • projektfu 19 minutes ago

            Before Google touched Android it was a cool concept but not what we think of today. Apparently it didn't even run on Linux. That concept came after the acquisition.

          • MegaDeKay an hour ago

            They bought YouTube, but you have to give Google a hell of a lot of credit for turning it into what it is today. Taking ownership of YouTube at the time was seen by many as taking ownership of an endless string of copyright lawsuits that could sue them into oblivion.

            • hadlock 14 minutes ago

              YouTube maintains a campus independent from the Google/Alphabet mothership. I'm curious how much direction they get, as they (outwardly, at least) appear to run semi-autonomously.

        • mike50 29 minutes ago

          Search was the only mostly original product. With the exception of YouTube (a purchase), Android, and ChromeOS, all the other products were initially clones.

        • m4rtink an hour ago

          Didn't they buy lots of those actually ?

      • eellpp an hour ago

        Google had less incentive. Their incentive was to keep the APIs bottled up and brewing as long as possible, so their existing moats in search and YouTube could extend into other areas. With OpenAI they are forced to compete or perish.

        Even with Gemini in the lead, it's only until they extinguish OpenAI or make ChatGPT unviable for OpenAI as a business. OpenAI may lose the talent war and cease to be the leader in this domain against Google (or Facebook), but in the longer term their incentive to break fresh ground aligns with average user requirements. With Chinese AI just behind, maybe Google/Microsoft have no choice either.

      • mikepurvis 2 hours ago

        Google was especially well positioned to catch up because they have a lot of the hardware and expertise and they have a captive audience in gsuite and at google.com.

    • eru 11 minutes ago

      What you are saying is true. But IBM failing to see a way to make money off a new technology isn't actually news worth updating on in this case?

    • mike50 32 minutes ago

      They were selling software as a service in the IBM 360 days. Relabeling a concept and buying Redhat don't count as investments.

      • hollerith 30 minutes ago

        What is your reason for believing that IBM was selling software as a service in the IBM 360 days?

        What hardware did the users of this service use to connect to the service?

    • nish__ an hour ago

      Neither cloud computing nor AI are good long term businesses. Yes, there's money to be made in the short term but only because there's more demand than there is supply for high-end chips and bleeding edge AI models. Once supply chains catch up and the open models get good enough to do everything we need them for, everyone will be able to afford to compute on prem. It could be well over a decade before that happens but it won't be forever.

      • echelon an hour ago

        This is my thinking too. Local is going to be huge when it happens.

        Once we have sufficient VRAM and speed, we're going to fly - not run - to a whole new class of applications. Things that just don't work in the cloud for one reason or another.

        - The true power of a "World Model" like Genie 2 will never happen with latency. That will have to run locally. We want local AI game engines [1] we can step into like holodecks.

        - Nobody is going to want to call OpenAI or Grok with personal matters. People want a local AI "girlfriend" or whatever. That shit needs to stay private for people.

        - Image and video gen is a never ending cycle of "Our Content Filters Have Detected Harmful Prompts". You can't make totally safe for work images or videos of kids, men in atypical roles (men with their children = abuse!), women in atypical roles (woman in danger = abuse!), LGBT relationships, world leaders, celebs, popular IPs, etc. Everyone I interact with constantly brings these issues up.

        - Robots will have to be local. You can't solve 6+DOF, dance routines, cutting food, etc. with 500ms latency.

        - The RIAA is going door to door taking down each major music AI service. Suno just recently had two Billboard chart-topping songs? Congrats - now the RIAA lawyers have sued them and reached a settlement. Suno now won't let you download the music you create. They're going to remove the existing models and replace them with "officially licensed" musicians like Katy Perry® and Travis Scott™. You won't retain rights to anything you mix. This totally sucks and music models need to be 100% local and outside of their reach.

        [1] Also, you have to see this mind-blowing interactive browser demo from 2022. It still makes my jaw drop: https://madebyoll.in/posts/game_emulation_via_dnn/

        • foobarian 19 minutes ago

          > You can't solve 6+DOF, dance routines, cutting food, etc. with 500ms latency.

          Hopefully it's just network propagation that creates that latency, otherwise local models will never beat the fanout in a massive datacenter.

    • DaiPlusPlus 2 hours ago

      > but it just wasn't IBM's software that people ended up buying.

      Well, I mean, WebSphere was pretty big at the time; and IBM VisualAge became Eclipse.

      And I know there were a bunch of LoB applications built on AS/400 (now called "System i") that had "real" web-frontends (though in practice, they were only suitable for LAN and VPN access, not public web; and were absolutely horrible on the inside, e.g. Progress OpenEdge).

      ...had IBM kept up the pretense of investment, and offered a real migration path to Java instead of a rewrite, then perhaps today might be slightly different?

      • nunez 16 minutes ago

        Websphere is still big at loads of banks and government agencies, just like Z. They make loads on both!

      • Insanity 2 hours ago

        Oh wow I didn’t know Eclipse was an IBM product originally. IDEs have come so far since Eclipse 15 years ago.

        And while I’m writing this I just finished up today’s advent of code using vim instead of a “real IDE” haha

  • stingraycharles 2 hours ago

    I still have PTSD from how much Watson was being pushed by external consultants to C levels despite it being absolutely useless and incredibly expensive. A/B testing? Watson. Search engine? Watson. Analytics? Watson. No code? Watson.

    I spent days, weeks arguing against it and ended up having to dedicate resources to build a PoC just to show it didn’t work, which could have been used elsewhere.

    • ares623 2 hours ago

      It's like poetry, it rhymes

    • 7thaccount 26 minutes ago

      This is going on all over again.

  • stego-tech 2 hours ago

    If anything, the fact they built such tooling might be why they're so sure it won't work. Don't get me wrong, I am incredibly not a fan of their entire product portfolio or business model (only Oracle really beats them out for "most hated enterprise technology company" for me), but these guys have tentacles just as deep into enterprises as Oracle and are coming up dry on the AI front. Their perspective shouldn't be ignored, though it should be considered in the wider context of their position in the marketplace.

  • throw0101a 44 minutes ago

    > Are they bitter that someone else has actually made the AI hype take off?

    Or they recognize that you may get an ROI on, e.g., a $10M capex outlay but not on a $100M or $1000M/$1B one.

  • broodbucket 2 hours ago

    IBM ostensibly failing with Watson (before Krishna was CEO for what it's worth) doesn't inherently invalidate his assessment here

    • johncolanduoni 2 hours ago

      It makes it suspect when combined with the obvious incentive to make the fact that IBM is basically non-existent in the AI space look like an intentional, sagacious choice to investors. It may very well be, but CEOs are fantastically unreliable narrators.

      • jayd16 2 hours ago

        You expect somebody to be heavily invested currently and also completely openly pessimistic about it?

        • johncolanduoni 43 minutes ago

          No, I don’t trust a word Sundar or Satya say about AI either. CEOs should be hyping anything they’re invested in, it’s literally their job. But convincing investors that every thing they don’t invest in heavily is worthless garbage is effectively part of their job too.

          What is more convincing is when someone invests heavily (and is involved heavily) and then decides to stop throwing good money after bad (in their estimation). Not that they’re automatically right, but I at least pay attention to their rationales. You learn very little about the real world by listening to the most motivated reasoner’s nearly fact-free bloviation.

        • Forgeties79 an hour ago

          Yeah I was going to say the same thing ha. I get what they’re (the commenter) saying, but one could also argue IBM is putting their money where their mouth is by not investing.

  • nunez 21 minutes ago

    IBM has been "quietly" churning out their Granite models, with the latest of which performing quite well against LLaMa and DeepSeek. So not Anthropic-level hype but not sitting it out completely either. They also provide IP indemnification for their models, which is interesting (Google Cloud does the same).

  • al_borland 41 minutes ago

    I see Watson stuff at work. It’s not a direct to consumer product, like ChatGPT, but I see it being used in the enterprise, at least where I’m at. IBM gave up on consumer products a long time ago.

    • CrI0gen 19 minutes ago

      Just did some brief Wikipedia browsing and I'm assuming it's WatsonX and not Watson? It seems Watson has been pretty much discontinued and WatsonX is LLM-based. If it is the old Watson, I'm curious what your impressions of it are. It was pretty cool and ahead of its time, but what it could actually do was way over-promised and overhyped.

  • jimbo808 2 hours ago

    Has it really taken off? Where's the economic impact that isn't investor money being burned or data center capex?

    • bncndn0956 an hour ago

      It's good that we are building all this excess capacity, which can be used for applications in other fields or research, or open up new fields.

      I think the dilemma with building so many data centers so fast is exactly like deciding whether I should buy the latest iPhone now or wait a few years for the specs or form factor to improve. The thing is, we have proven tech with current AI models, so waiting for better tech to develop at small scale before scaling up is a bad strategy.

  • firesteelrain an hour ago

    IBM makes WatsonX for corporations that want air-gapped AI

  • ghaff 2 hours ago

    Initial Watson was sort of a mess. But a lot of the Watson-related tech is integrated into a lot of products these days.

    • mikalauskas 2 hours ago

      What related tech and what products? It would be interesting to read about them.

      • ghaff 2 hours ago

        Baked into a lot of Red Hat products, including Ansible and RHEL. I'm not that directly involved any longer. Probably best to read up on watsonx.ai.

    • hn_throwaway_99 2 hours ago

      Such as? I'm curious because I know a bunch of people who did a lot of Watson-related work and it was all a dead end, but that was 2020-ish timeframe.

      • ghaff 2 hours ago

        IBM did a lot of pretty fragmented and often PR-adjacent work. And getting into some industry-specific (e.g. healthcare) things that didn't really work out. But my understanding is that it's better standardized and embedded in products these days.

        • hn_throwaway_99 an hour ago

          Not to be rude, but that didn't answer my question.

          Taking a look at IBM's Watson page, https://www.ibm.com/watson, it appears to me that they basically started over with "watsonx" in 2023 (after ChatGPT was released) and what's there now is basically just a hat tip to their previous branding.

  • Den_VR 2 hours ago

    Watson X is still a product line sold today to qualified customers :)

  • edm0nd 2 hours ago

    Honestly I'm not even sure what IBM does these days. Seems like one company that has slowly been dying for decades.

    but when I look at their stock, it's at all-time highs lol

    no idea

    • nunez 9 minutes ago

      Mainframe for sure, but IBM has TONS of products in their portfolio that get bought. They also have IBM Cloud which is popular. Then there is the Quantum stuff they've been sinking money into for the last 20 years or so.

    • 7thaccount 18 minutes ago

      My limited understanding (please take with a big grain of salt) is that they 1.) sell mainframes, 2.) sell mainframe compute time, 3.) sell mainframe support contracts, 4.) sell Red Hat and Red Hat support contracts, and 5.) buy out a lot of smaller software and hardware companies in a manner similar to private equity.

    • darth_avocado 2 hours ago

      They make business machines, internationally.

      • nickpeterson 2 hours ago

        Pretty sure they made all their money fighting the paperwork explosion.

    • mike50 28 minutes ago

      Basic research and mainframe support contracts. Also they bought RedHat.

    • Sl1mb0 an hour ago

      They manage a lot of old, big mainframes for banks. At least that is one thing I know of.

    • broodbucket 2 hours ago

      IBM is probably involved somewhere in the majority of things you interact with day to day

  • MangoToupe 2 hours ago

    > Are they bitter that someone else has actually made the AI hype take off?

    Does it matter? It’s still a scam.

winddude 16 minutes ago

$8T is the high end of the McKinsey estimate of $4-8T by 2030. That includes non-AI data-centre IT, AI data-centre, and power infrastructure build-out, as well as real estate for data centres.

Not all of it would be debt. Google, Meta, Microsoft, and AWS have massive profits to fund their build-outs. Power infrastructure will be funded by governments and tax dollars.

  • oblio 12 minutes ago

    There is mounting evidence that even places like Meta are increasing their leverage (debt load) to fund this scale-out. They're also starting to do accounting tricks like longer depreciation for assets that degrade quickly, such as GPUs (all the big clouds have increased their hardware depreciation from 2-4 years to 6), which makes their financial numbers look better but might not mean that all that hardware is still usable at production levels 6 years from now.

    They're all starting to strain under all this AI pressure, even with their mega profits.

1vuio0pswjnm7 4 hours ago

One thing we saw with the dot-com bust is how certain individuals were able to cash in on the failures, e.g., low cost hardware, domain names, etc. (NB. prices may exceed $2)

Perhaps people are already thinking about how they can cash in on the floor space and HVAC systems that will be left in the wake of failed "AI" hype

  • blibble 4 hours ago

    I'm looking forward to buying my own slightly used 5 million square ft data centre in Texas for $1

    • jsheard 3 hours ago

      Tired: homelabbers bringing decommissioned datacenter gear home.

      Wired: homelabbers moving into decommissioned datacenters.

      • viccis a minute ago

        I miss First Saturday in Dallas where we honest to god did buy decommissioned datacenter gear out of the back of a van.

      • reverius42 3 hours ago

        More of a labhome than a homelab at that point.

      • renegade-otter 3 hours ago

        "Loft for rent, 50,000 sq ft in a new datacenter, roof access, superb wiring and air conditioning, direct access to fiber backbone."

        • nvader 2 hours ago

          Never worn.

    • WhyOhWhyQ 4 hours ago

      You're out of luck because I am willing to pay at least $2.

      • blibble 4 hours ago

        there'll be plenty for everyone!

    • trhway 2 hours ago

      In TX? In the Russian blogosphere it is a standard staple that Trump is rushing a Ukrainian peace deal to be able to move on to a set of mega-projects with Russia - oil/gas in the Arctic and data centers in the Russian North-West, where electricity and cooling are plentiful and cheap.

      • cheema33 42 minutes ago

        Build trillion dollar data center infrastructure in Russia. What could possibly go wrong?

        Ask the owners of the leased airplanes who have been unsuccessfully trying to get their planes back for about 3 years.

      • ekropotin an hour ago

        Sounds like a kremlebot narrative; however, the motivation behind pushing it is unclear to me. Also, why not build DCs in Alaska instead?

        • trhway an hour ago

          Actually it is more of the opposition's narrative, probably a way to explain Trump's pro-Russian position.

          I think any such data center project is doomed to ultimately fail, and any serious investment will be for me a sign of the bubble peak exuberance and irrationality.

      • oblio 10 minutes ago

        What could go wrong with placing critical infrastructure on the soil of a strategic rival?

      • voidfunc 2 hours ago

        Oil and Gas in The Arctic I can see, but data centers in Russia... good luck with that.

  • 1vuio0pswjnm7 3 hours ago

    From the article:

    ""It's my view that there's no way you're going to get a return on that, because $8 trillion of capex means you need roughly $800 billion of profit just to pay for the interest," he said."

    • bitexploder 3 hours ago

      Right, THEY can't, but cloud providers potentially can. And there are probably other uses for everything that's not GPU/TPU for the Googles of the world. They are out way less than IBM, which cannot monetize the space or build data centers efficiently like AWS and Google.

  • pseudosavant 2 hours ago

    The dotcom bust killed companies, not the Internet. AI will be no different. Most players won’t make it, but the tech will endure and expand.

    • codingdave 2 hours ago

      Or endure and contract.

      The key difference between AI and the initial growth of the web is that the more use cases to which people applied the web, the more of it people wanted. AI is the opposite - people love LLM-based chatbots. But it is being pushed into many other use cases where it just doesn't work as well. Or works well, but people don't want AI-generated deliverables. Or leaders are trying to push non-deterministic products into deterministic processes. Or tech folks are jumping through massive hoops to get the results they want because without doing so, it just doesn't work.

      Basically, if a product manager kept pushing features the way AI is being pushed -- without PMF, without profit -- that PM would be fired.

      This probably all sounds anti-AI, but it is not. I believe AI has a place in our industry. But it needs to be applied correctly, where it does well. Those use cases will not be universal, so I repeat my initial prediction. It will endure and contract.

    • bigstrat2003 2 hours ago

      The difference is that the Internet was actually useful technology, whereas AI is not (so far at least).

      • 7thaccount 11 minutes ago

        I think you're exaggerating a little, but aren't entirely wrong. The Internet has completely changed daily life for most of humanity. AI can mean a lot of things, but a lot of it is blown way out of proportion. I find LLMs useful to help me rephrase a sentence or explain some kind of topic, but it pales in comparison to email and web browsers, YouTube, and things like blogs.

      • ProjectArcturis 2 hours ago

        More use cases for AI than blockchain so far.

        • fwip 16 minutes ago

          Quite a low bar.

          • oblio 9 minutes ago

            Block chain is more like some gooey organic substance on the ground than a bar.

  • ekropotin 3 hours ago

    Can’t wait for all this cheap ddr5 memory and GPUs

    • jmspring 2 hours ago

      I was looking at my Newegg orders recently. 7/18/2023 - 64GB (2 x 32GB) 288-Pin PC RAM DDR5 6000 (PC5 48000) --> $260. Now, $750+.

      • 3eb7988a1663 33 minutes ago

        Holy cow. I have 96GB of DDR5 I bought at start of year for a machine which never materialized. Might have to flip it.

        • ekropotin 12 minutes ago

          Never in my dreams could I have imagined PC parts becoming an investment. Someone should start an ETF tracking the prices.

      • ekropotin 2 hours ago

        Don’t even get me started on this. I’ve recently been shopping on eBay for some DDR4 memory. You may think - who’d need this dated stuff besides me? Yet 16GB 3200MHz is at least $60, which is effectively the price you paid for DDR5 6000. Crazy, right?

      • tempest_ 2 hours ago

        I have 4 32gb sticks of DDR5 6400 in my machine.

        The RAM in my machine being worth more than the graphics card (7900XTX) was not on my bingo card I can tell you that.

  • matt-p 3 hours ago

    To be honest, AI datacentres would need a rip-and-replace to get back to normal datacentre density, at least on the cooling and power systems.

    Maybe useful for some kind of manufacturing or industrial process.

    • alphabetag675 3 hours ago

      Cheap compute would be a boon for science research.

      • scj 3 hours ago

        It'll likely be used to mine bitcoin instead.

        • BanazirGalbasi 3 hours ago

          The GPUs, sure. The mainboards and CPUs can be used in clusters for general-purpose computing, which is still more prevalent in most scientific research as far as I am aware. My alma mater has a several-thousand-core cluster that any student can request time on as long as they have reason to do so, and it's all CPU compute. Getting non-CS majors to write GPU code is unlikely in that scenario.

          • marcosdumay 2 hours ago

            > Getting non-CS majors to write GPU code is unlikely in that scenario.

            People mostly use a GPU-enabled liblapack. Physics, chemistry, biology, and medicine departments can absolutely use the GPUs.

  • kerabatsos 3 hours ago

    Why do you believe it will fail? Because some companies will not be profitable?

    • rzwitserloot 3 hours ago

      It wasn't an 'it', it was a 'some'. Some of these companies that are investing massively in data centers will fail.

      Right now essentially none have 'failed' in the sense of 'bankrupt with no recovery' (Chapter 7). They haven't run out of runway yet, and the equity markets are still so eager, even a bad proposition that includes the word 'AI!' is likely to be able to cut some sort of deal for more funds.

      But that won't last. Some companies will fail - probably enough failures that the successful companies won't be able to meaningfully absorb the sudden bursts of supply of AI-related gear.

      That's all the comment you are replying to is implying.

    • hkt 3 hours ago

      Given the amounts being raised and spent, one imagines that the ROI will be appalling unless the pesky humans learn to live on cents a day, or the world economy grows by double digits every year for a few decades.

      • marcosdumay 2 hours ago

        If the entire world economy starts to depend on those companies, they would pay off with "startup level" ROI. And by "startup level" I mean the returns bullish people say startup funds can deliver (10 to 100x), not a bootstrapped unicorn.

    • ulfw 3 hours ago

      I mean that is how capitalism works, no?

  • PunchyHamster 4 hours ago

    The constant cost of people and power won't make it all that much cheaper than current prices to put a server into someone else's rack.

  • lawlessone 2 hours ago

    >cash in on the floor space and HVAC systems that will be left in the wake of failed "AI" hype

    I'd worry surveillance companies might.

  • cagenut 2 hours ago

    you could stuff the racks full of server-rack batteries (lfp now, na-ion maybe in a decade) and monetize the space and the high capacity grid connect

    most of the hvac would sit idle tho

mbreese 9 hours ago

I would add an addendum to this -- there is no way the announced spending on AI data centers will all come to fruition. I have no doubt that there will be a massive build-out of infrastructure, but it can't reach the levels that have been announced. The power requirements alone will stop that from happening.

  • arisAlexis 9 hours ago

    What qualifies you to know better than CEOs and teams that did a lot of research into this?

    • skippyboxedhero 3 hours ago

      The incentive for CEOs is announcing the plan to do something, they have no idea if they will actually be able to do it, and it probably won't matter.

      This happened in the dotcom era too, btw. Companies built out fibre networks; it wasn't possible to actually build all the physical infra that companies wanted, so many announced plans never happened. Then, towards the end, companies began aggressively acquiring stakes in companies who were building stuff to get financial exposure (an example was BT, which turned itself briefly into a hedge fund with a telephone network attached... before it imploded).

      CEOs do not operate on the timescale of waiting and building. Their timescale is this year's bonus/share options package. Nothing else matters: announce plans to do X or Y, doesn't matter, they know they will be gone long before it happens.

    • mbreese 8 hours ago

      I believe there is a difference between what people say publicly and what they are actually committed to doing on the ground. When all is said and done, I'll be more interested to know what was actually spent.

      For example, XYZ AI company may say they are going to spend $1T for AI data centers over the next 5 years.

      In actuality, I suspect it is likely that they have committed to something like $5-10B in shovel-ready projects with stretch goals for the rest. And the remaining spend would be heavily conditioned -- is power available? are chips available? is the public support present? financing? etc...

      Not to mention, it's a much bigger moat if you can claim you're going to spend $1T. Who else will want to compete with you when you're spending $1T. After the dust has settled and you've managed to be one of the 2-3 dominant AI players, who is going to care that you "only" spent $100B instead of $1T. Look -- you were very capital efficient!

      So, do I see it as possible that XYZ AI company could spend $1T, sure. Is it likely? No.

    • rs186 3 hours ago

      Hmm... "CEOs and teams" don't necessary do what's makes sense mathematically. Many, if not most of them, do whatever that sounds good to shareholders in their quarterly earnings call and ignore the reality or long term development.

      If "CEOs and teams" are smart enough, they would not have overhired during 2021-2022 and then do layoffs. Who would be dumb enough to do that?

    • nish__ an hour ago

      Appeal to authority fallacy.

    • asadotzler 7 hours ago

      What qualifies you to question this?

    • skywhopper 4 hours ago

      lol, CEOs do not do research.

      • Groxx 3 hours ago

        ^ they are a mouthpiece to manipulate the market, not a research scientist.

Octoth0rpe 9 hours ago

> Krishna also referenced the depreciation of the AI chips inside data centers as another factor: "You've got to use it all in five years because at that point, you've got to throw it away and refill it," he said

This doesn't seem correct to me, or at least it's built on several shaky assumptions. You would have to 'refill' your hardware if:

- AI accelerator cards all start dying around the 5 year mark, which is possible given the heat density/cooling needs, but doesn't seem all that likely.

- Technology advances such that only the absolute newest cards can be used to run _any_ model profitably, which only seems likely if we see some pretty radical advances in efficiency. Otherwise, it seems like assuming your hardware is stable after 5 years of burn in, you could continue to run older models on that hardware at only the cost of the floorspace/power. Maybe you need new cards for new models for some reason (maybe a new fp format that only new cards support? some magic amount of ram? etc), but it seems like there may be room for revenue via older/less capable models at a discounted rate.

  • darth_avocado 2 hours ago

    Isn’t that what Michael Burry is complaining about? That five years is actually too generous when it comes to depreciation of these assets and that companies are being too relaxed with that estimate. The real depreciation is more like 2-3 years for these GPUs that cost tens of thousands of dollars a piece.

    https://x.com/michaeljburry/status/1987918650104283372

    • duped 4 minutes ago

      How different is this from rental car companies changing over their fleets? I don't know, this is a genuine question. The cars cost 3-4x as much and last about 2x as far as I know, and the secondary market is still alive.

  • slashdave 4 hours ago

    5 years is long, actually. This is not a GPU thing. It's standard for server hardware.

    • bigwheels 3 hours ago

      Because usually it's more efficient for companies to retire the hardware and put in new stuff.

      Meanwhile, my 10-15 year old server hardware keeps chugging along just fine in the rack in my garage.

      • AdrianB1 2 hours ago

        I thought the same until I calculated that newer hardware consumes a few times less energy and for something running 24x7 that adds up quite a bit (I live in Europe, energy is quite expensive).

        So my homelab equipment is just 5 years old and it will get replaced in 2-3 years with something even more power efficient.

      • slashdave 2 hours ago

        More than that. The equipment is depreciated on a 5 year schedule on the company balance sheet. It actually costs nothing to discard it.

        • johncolanduoni 2 hours ago

          There’s no marginal tax impact of discarding it or not after 5 years - if it was still net useful to keep it powered, they would keep it. Depreciation doesn’t demand you dispose of or sell the item to see the tax benefit.

          • mattmaroon an hour ago

            No, but it tips the scales. If the new hardware is a little more efficient, but perhaps not so much so that you would necessarily replace it, the ability to depreciate the new stuff, but not the old stuff, might tip your decision.

      • XorNot 3 hours ago

        Sample size of 1 though. It's like how I've had hard disks last a decade, but a 100 node Hadoop cluster had 3 die per week after a few years.

        • snuxoll 3 hours ago

          Spinning rust and fans are the outliers when it comes to longevity in compute hardware. I’ve had to replace a disk or two in my rack at home, but at the end of the day the CPUs, RAM, NICs, etc. all continue to tick along just fine.

          When it comes to enterprise deployments, the lifecycle always revolves around price/performance. Why pay for old gear that sucks up power and runs 30% slower than the new hotness, after all!

          But, here we are, hitting limits of transistor density. There’s a reason I still can’t get 13th or 14th gen poweredge boxes for the price I paid for my 12th gen ones years ago.

    • matt-p 3 hours ago

      5 years is a long time for GPUs maybe but normal servers have 7 year lifespans in many cases fwiw.

      These GPUs, I assume, basically have potential longevity issues due to the density; if you could cool them really, really well, I imagine there'd be no problem.

      • atherton94027 3 hours ago

        > normal servers have 7 year lifespans in many cases fwiw

        Eight years if you use Hetzner servers!

      • slashdave 2 hours ago

        Normal servers are rarely run flat-out. These GPUs are supposed to be run that way. So, yeah, age is going to be a problem, as will cooling.

  • abraae 9 hours ago

    It's just the same dynamic as old servers. They still work fine but power costs make them uneconomical compared to latest tech.

    • acdha 9 hours ago

      It’s far more extreme: old servers are still okay on I/O, and memory latency, etc. won’t change that dramatically so you can still find productive uses for them. AI workloads are hyper-focused on a single type of work and, unlike most regular servers, a limiting factor in direct competition with other companies.

      • matt-p 3 hours ago

        I mean, you could use training GPUs for inference, right? That would be use case number one for an 8 * A100 box in a couple of years. They can also be used for non-IO-limited things like folding proteins or other 'scientific' use cases. Push comes to shove, I'm sure an old A100 will run Crysis.

        • oblio 2 minutes ago

          All those use cases would probably use up 1% of the current AI infrastructure, let alone what they're planning to build.

          Yeah, just like gas, possible uses will expand if AI crashes out, but:

          * will these uses cover, say, 60% of all this infra?

          * will these uses scale up to use that 60% within the next 5-7 years, while that hardware is still relevant and fully functional?

          Also, we still have railroad tracks from the 1800s rail mania that were never truly used to capacity and dot com boom dark fiber that's also never been used fully, even with the internet growing 100x since.

    • m00x 3 hours ago

      LambdaLabs is still making money off their Tesla V100s, A100s, and A6000s. The older ones are cheap enough to run some models and very cheap, so if that's all you need, that's what you'll pick.

      The V100 was released in 2017, A6000 in 2020, A100 in 2021.

    • Havoc 7 hours ago

      That could change with a power generation breakthrough. If power is very cheap then running ancient gear till it falls apart starts making more sense

      • overfeed 2 hours ago

        Power consumption is only part of the equation. More efficient chips => less heat => lower cooling costs and/or higher compute density in the same space.

        • nish__ an hour ago

          Solution: run them in the north. Put a server in the basement of every home in Edmonton and use the excess heat to warm the house.

    • dogman144 8 hours ago

      Manipulating this for creative accounting seems to be the root of Michael Burry’s argument, although I’m not fluent enough in his figures to map it here. But it is interesting to see IBM argue a similar case (somewhat), and comments ITT hitting the same known facts, in light of Nvidia’s counterpoints to him.

    • zppln 9 hours ago

      I'm a little bit curious about this. Where do all the hardware from the big tech giants usually go once they've upgraded?

      • q3k 4 hours ago

        In-house hyperscaler stuff gets shredded, after every single piece of flash storage gets first drilled through and every hard drive gets bent by a hydraulic press. Then it goes into the usual e-waste recycling stream (ie. gets sent to poor countries where precious metals get extracted by people with a halved life expectancy).

        Off-the-shelf enterprise gear has a chance to get a second life through remarketing channels, but much of it also gets shredded due to dumb corporate policies. There are stories of some companies refusing to offload a massive decom onto the second hand market as it would actually cause a crash. :)

        It's a very efficient system, you see.

      • wmf 9 hours ago

        Some is sold on the used market; some is destroyed. There are plenty of used V100 and A100 available now for example.

      • trollbridge 9 hours ago

        I used (relatively) ancient servers (5-10 years in age) because their performance is completely adequate; they just use slightly more power. As a plus it's easy to buy spare parts, and they run on DDR3, so I'm not paying the current "RAM tax". I generally get such a server, max out its RAM, max out its CPUs and put it to work.

        • taneq 4 hours ago

          Same, the bang for buck on a 5yo server is insane. I got an old Dell a year ago (to replace our 15yo one that finally died) and it was $1200 AUD for a maxed out recently-retired server with 72TB of hard drives and something like 292GB of RAM.

          • PunchyHamster 4 hours ago

            Just not too old. Easy to get into "power usage makes it not worth it" for any use case when it runs 24/7

            • monster_truck 3 hours ago

              Seriously. 24/7 adds up faster than most realize!

              The idle wattage per module has shrunk from 2.5-3W down to 1-1.2 between DDR3 & DDR5. Assuming a 1.3W difference (so 10.4W for 8760 hours), a DDR3 machine with 8 sticks would increase your yearly power consumption by almost 1% (assuming avg 10,500kWh/yr household)

              That's only a couple dollars in most cases but the gap is only larger in every other instance. When I upgraded from Zen 2 to Zen 3 it was able to complete the same workload just as fast with half as many cores while pulling over 100W less. Sustained 100% utilization barely even heats a room effectively anymore!
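
              A quick check of that math (a minimal sketch; the 1.3 W per-module delta and the 10,500 kWh/yr average household are the assumptions stated above):

                  # Idle-draw delta for 8 DIMMs, DDR3 vs DDR5, running 24/7
                  delta_w_per_module = 1.3            # ~2.5-3 W (DDR3) minus ~1-1.2 W (DDR5)
                  modules = 8
                  hours_per_year = 8760

                  extra_kwh = delta_w_per_module * modules * hours_per_year / 1000
                  print(extra_kwh)                    # ~91 kWh/yr
                  print(extra_kwh / 10_500 * 100)     # ~0.9% of a 10,500 kWh/yr household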

              • nish__ an hour ago

                Wake on LAN?

            • dpe82 4 hours ago

              Maybe? The price difference on newer hardware can buy a lot of electricity, and if you aren't running stuff at 100% all the time the calculation changes again. Idle power draw on a brand new server isn't significantly different from one that's 5 years old.

    • PunchyHamster 4 hours ago

      Eh, not exactly. If you don't run the CPU at 70%+, the rest of the machine isn't that much more inefficient than a model generation or two behind.

      It used to be that new server could use half power of the old one at idle but vendors figured out that servers also need proper power management a while ago and it is much better.

      Last few gens increase could be summed up to "low % increase in efficiency, with TDP, memory channels and core count increase".

      So for loads that aren't CPU bound, the savings on a newer gen aren't nearly worth the replacement, and for bulk storage the CPU power usage is an even smaller part.

      • matt-p 3 hours ago

        Definitely single thread performance and storage are the main reasons not to use an old server. A 6 year old server didn't have nvme drives, so SATA SSD at best. That's a major slow down if disk is important.

        Aside from that there's no reason to not use a dual socket server from 5 years ago instead of a single socket server of today. Power and reliability maybe not as good.

    • knowitnone3 4 hours ago

      that was then. now, high-end chips are reaching 4,3,2 nm. power savings aren't that high anymore. what's the power saving going from 4 to 2nm?

      • monster_truck 4 hours ago

        +5-20% clock speed at 5-25% lower voltages (which has been and continues to be the trend) adds up quickly from gen to gen, never mind density or IPC gains.

  • rzerowan 8 hours ago

    I think it's illustrative to consider the previous computation cycle, i.e. crypto mining, which passed through a similar lifecycle with energy and GPU accelerators.

    The need for cheap wattage forced operations to arbitrage locations for the cheapest reliable existing supply - there rarely was new build-out, as the cost was to be reimbursed by the coins the mining pool recovered.

    The chip situation caused the same appreciation in GPU cards, with periodic offloading of cards to the secondary market (after wear and tear) as newer/faster/more efficient cards came out, until custom ASICs took over the heavy lifting, causing the GPU card market to pivot.

    Similarly, in the short to medium term, the uptick of custom ASICs like the Google TPU will definitely make a dent in both capex/opex and potentially also lead to a market with used GPUs as ASICs dominate.

    So for GPUs I can certainly see the 5-year horizon making an impact on investment decisions as ASICs proliferate.

  • marcosdumay 2 hours ago

    Historically, GPUs have improved in efficiency fast enough that people retired their hardware in way less than 5 years.

    Also, historically the top of the line fabs were focused on CPUs, not GPUs. That has not been true for a generation, so it's not really clear if the depreciation speed will be maintained.

  • austin-cheney 9 hours ago

    It’s not about assumptions on the hardware. It’s about the current demands for computation and the expected growth of business needs. Since we have a couple of years to measure against, it should be extremely straightforward to predict. As such, I have no reason to doubt the stated projections.

    • lumost 3 hours ago

      Networking gear was famously overbought. Enterprise hardware is tricky as there isn’t much of a resale market for this gear once all is said and done.

      The only valid use case for all of this compute which could reasonably replace ai is btc mining. I’m uncertain if the increased mining capacity would harm the market or not.

    • 9cb14c1ec0 4 hours ago

      > Since we have a couple years to measure against

      Trillion pound baby fallacy.

    • andix 9 hours ago

      Failure rates also go up. For AI inference it’s probably not too bad in most cases, just take the node offline and re-schedule the jobs to other nodes.

  • mcculley 9 hours ago

    But if your competitor is running newer chips that consume less power per operation, aren't you forced to upgrade as well and dispose of the old hardware?

    • Octoth0rpe 9 hours ago

      Sure, assuming the power cost reduction or capability increase justifies the expenditure. It's not clear that that will be the case. That's one of the shaky assumptions I'm referring to. It may be that the 2030 nvidia accelerators will save you $2000 in electricity per month per rack, and you can upgrade the whole rack for the low, low price of $800,000! That may not be worth it at all. If it saves you $200k/per rack or unlocks some additional capability that a 2025 accelerator is incapable of and customers are willing to pay for, then that's a different story. There are a ton of assumptions in these scenarios, and his logic doesn't seem to justify the confidence level.
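
      To put a number on that hypothetical (a minimal payback sketch; the $2,000/month saving and $800,000 upgrade price are the made-up figures above):

          # Simple payback period for a hypothetical rack upgrade
          upgrade_cost = 800_000               # $ to refresh the rack
          monthly_saving = 2_000               # $ saved in electricity per month

          payback_years = upgrade_cost / (monthly_saving * 12)
          print(payback_years)                 # ~33 years, so likely not worth it
          # With savings closer to the $200k figure above, the payback shrinks dramatically
          # and the upgrade starts to make sense.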

      • overfeed 2 hours ago

        > Sure, assuming the power cost reduction or capability increase justifies the expenditure. It's not clear that that will be the case.

        Share price is a bigger consideration than any +/- differences[1] between expenditure vs productivity delta. GAAP allows some flexibility in how servers are depreciated, so depending on what the company wants to signal to shareholders (investing in infra for future returns vs curtailing costs), it may make sense to shorten or lengthen depreciation time regardless of the actual TCO keep/refresh cost comparisons.

        1. Hypothetical scenario: a hardware refresh costs $80B, the actual performance increase is only worth $8B, but the share price increases the value of the org's holding of its own shares by $150B. As a CEO/CFO, which action would you recommend, without even considering your own bonus that's implicitly or explicitly tied to share price performance?

      • maxglute 7 hours ago

        Demand/supply economics is not so hypothetical.

        Illustration numbers: AI demand premium = $150 hardware with $50 electricity. Normal demand = $50 hardware with $50 electricity. This is Nvidia margins @75% instead of 40%. CAPEX/OPEX is 70%/20% hardware/power instead of customary 50%/40%.

        If the bubble crashes, i.e. the AI demand premium evaporates, we're back at $50 hardware and $50 electricity. Likely $50 hardware and $25 electricity if hardware improves. Nvidia back to 30-40% margins, operators on old hardware stuck with stranded assets.

        The key thing to understand is that current racks are sold at grossly inflated premiums right now - scarcity pricing/tax. If the current AI economic model doesn't work, then fundamentally that premium goes away and subsequent build-outs are going to be cost-plus/commodity pricing = capex discounted by non-trivial amounts. Any breakthroughs in hardware, i.e. TPU compute efficiency, would stack opex (power) savings. Maybe by year 8, the first gen of data centers is still depreciated to $80 hardware + $50 power vs a new center @ $50 hardware + $25 power. That old data center is a massive write-down because it will generate less revenue than it costs to amortize.

      • trollbridge 9 hours ago

        A typical data centre is $2,500 per year per kW load (including overhead, hvac and so on).

        If it costs $800,000 to replace the whole rack, then that would pay off in a year if it reduces 320 kW of consumption. Back when we ran servers, we wouldn't assume 100% utilisation but AI workloads do do that; normal server loads would be 10kW per rack and AI is closer to 100. So yeah, it's not hard to imagine power savings of 3.2 racks being worth it.
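
        As a sketch of that break-even arithmetic (same numbers as above; the $2,500 is the all-in per-kW-year cost, the $800,000 the hypothetical rack refresh):

            # How much power saving justifies an $800k rack refresh within one year?
            cost_per_kw_year = 2500          # $ per kW-year, all-in (power, HVAC, overhead)
            refresh_cost = 800_000           # $ to replace the whole rack

            breakeven_kw = refresh_cost / cost_per_kw_year
            print(breakeven_kw)              # 320 kW saved -> pays off in a year
            # At ~100 kW per AI rack, that's the draw of roughly 3.2 racks.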

        • Octoth0rpe 8 hours ago

          Thanks for the numbers! Isn't it more likely that the amount of power/heat generated per rack will stay constant over each upgrade cycle, and the upgrade simply unlocks a higher amount of service revenue per rack?

          • PunchyHamster 4 hours ago

            Not in the last few years. CPUs went from ~200W TDP to 500W.

            And they went from zero to multiple GPUs per server. Tho we might hit "the chips can't be bigger and the cooling can't get much better" point there.

            The usage would be similar if it was say a rack filled with servers full of bulk storage (hard drives generally keep the power usage similar while growing storage).

            But CPU/GPU wise, it's just bigger chips/more chiplets, more power.

            I'd imagine any flattening might be purely because "we have DC now, re-building cooling for next gen doesn't make sense so we will just build servers with similar power usage as previously", but given how fast AI pushed the development it might not happen for a while.

          • toast0 3 hours ago

            > Isn't it more likely that the amount of power/heat generated per rack will stay constant over each upgrade cycle,

            Power density seems to grow each cycle. But eventually your DC hits power capacity limits, and you have to leave racks empty because there's no power budget.

    • HWR_14 8 hours ago

      It depends on how much profit you are making. As long as you can still be profitable on the old hardware you don't have to upgrade.

      • AstroBen 4 hours ago

        That's the thing though: a competitor with better power efficiency can undercut you and take your customers

        • tzs an hour ago

          Or they could charge the same as you and make more money per customer. If they already have as many customers as they can handle doing that may be better than buying hardware to support a larger number of customers.

  • loeg 3 hours ago

    It's option #2. But 5-year depreciation is optimistic; 2-3 years is more realistic.

  • lithos 8 hours ago

    It's worse than that in reality: AI chips are on a two-year cadence for backwards compatibility (NVIDIA can basically guarantee it, and you probably won't be able to pay real AI devs enough to stick around to make hardware workarounds). So their accounting is optimistic.

    • Patrick_Devine 3 hours ago

      5 years is normal-ish depreciation time frame. I know they are gaming GPUs, but the RTX 3090 came out ~ 4.5 years before the RTX 5090. The 5090 has double the performance and 1/3 more memory. The 3090 is still a useful card even after 5 years.

  • dmoy 9 hours ago

    5 years is maybe referring to the accounting schedule for depreciation on computer hardware, not the actual useful lifetime of the hardware.

    It's a little weird to phrase it like that though because you're right it doesn't mean you have to throw it out. Idk if this is some reflection of how IBM handles finance stuff or what. Certainly not all companies throw out hardware the minute they can't claim depreciation on it. But I don't know the numbers.

    Anyways, 5 years is an inflection point in the numbers. Before 5 years you get depreciation to offset some cost of running. After 5 years, you do not, so the math does change.

    • skeeter2020 9 hours ago

      That is how the investments are costed, though, so it makes sense when we're talking return on investment: you can compare with alternatives under the same evaluation criteria.

  • rlupi 7 hours ago

    Do not forget that we're talking about supercomputers. Their interconnect makes machines not easily fungible, so even a low reduction in availability can have dramatic effects.

    Also, after the end of the product life, replacement parts may no longer be available.

    You need to get pretty creative with repair & refurbishment processes to counter these risks.

  • coliveira 8 hours ago

    There is the opportunity cost of using a whole datacenter to house ancient chips, even if they're still running. You're thinking of it like a personal-use chip, which you can run as long as it is non-defective. But for datacenters it doesn't make sense to use the same chips for more than a few years, and I think 5 years is already stretching their real shelf life.

  • more_corn 3 hours ago

    When you operate big data centers it makes sense to refresh your hardware every 5 years or so, because that’s the point at which the refreshed hardware is enough better to be worth the effort and expense. You don’t HAVE to, but it’s more cost-effective if you do. (Source: used to operate big data centers)

1970-01-01 3 hours ago

There's really 3 fears going on:

1. The devil you know (bubble)

2. The devil you don't (AI global revolution)

3. Fear of missing out on devil #2

I don't think IBM knows anything special. It's just more noise about fear1 & fear3.

ic_fly2 9 hours ago

IBM might not have a data strategy or an AI plan, but he isn’t wrong about the inability to generate a profit.

A bit of napkin math: NVIDIA claims 0.4 J per token for their latest generation. A 1 GW plant with 80% utilisation can therefore produce 6.29 × 10^16 tokens a year.

There are ~10^14 tokens on the internet. ~10^19 tokens have been spoken by humans… so far.
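
A quick sanity check of that arithmetic (a minimal sketch in Python; the 0.4 J/token, 1 GW, and 80% utilisation figures are the ones above):

    # Back-of-the-envelope token throughput for a 1 GW plant at 0.4 J/token
    joules_per_token = 0.4                      # NVIDIA's claimed energy per token
    plant_power_w = 1e9                         # 1 GW
    utilisation = 0.8                           # 80% utilisation
    seconds_per_year = 365 * 24 * 3600

    energy_per_year_j = plant_power_w * utilisation * seconds_per_year
    tokens_per_year = energy_per_year_j / joules_per_token
    print(f"{tokens_per_year:.2e}")             # ~6.3e16 tokens/year, matching the figure above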

  • lostmsu 9 hours ago

    > ~10^14 tokens on the internet

    Does that include image tokens? My bet is with image tokens you are off by at least 5 orders of magnitude for both.

    • scotty79 2 hours ago

      Images are not that big. Each text token is a multidimensional vector.

      There were recent observations that rendering the text as an image and ingesting the image might actually be more efficient than using text embedding.

  • senordevnyc 9 hours ago

    I must be dense, why does this imply AI can't be profitable?

    • mywittyname 4 hours ago

      Tokens are, roughly speaking, how you pay for AI. So you can approximate revenue by multiplying tokens per year by the revenue for a token.

      (6.29 × 10^16 tokens a year) * ($10 per 10^6 tokens)

      = $6.29 × 10^11

      = $629,000,000,000 per year in revenue

      Per the article

      > "It's my view that there's no way you're going to get a return on that, because $8 trillion of capex means you need roughly $800 billion of profit just to pay for the interest," he said.

      $629 billion is less than $800 billion. And we are talking raw revenue (not profit). So we are already in the red.

      But it gets worse: that $10 per million tokens cost is for GPT-5.1, which is one of the most expensive models. And the costs don't account for input tokens, which are usually a tenth of the cost of output tokens. And using the bulk API instead of the regular one halves costs again.

      Realistic revenue projections for a data center are closer to sub $1 per million tokens, $70-150 billion per year. And this is revenue only.

      To make profits at current prices, the chips need to increase in performance by some factor, and power costs need to fall by another factor. The combination of these factors need to be, at minimum, like 5x, but realistically need to be 50x.
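
      (To make the comparison concrete, here is the same napkin math as a small Python sketch; the per-million-token prices are rough assumptions, not quoted rates, and the $800B figure is the article's interest estimate on $8T of industry-wide capex.)

          # Revenue scenarios for one 1 GW plant's annual token output.
          tokens_per_year = 6.29e16
          interest_per_year = 800e9          # the article's interest figure on $8T capex

          for label, usd_per_million_tokens in [
              ("GPT-5.1-class retail pricing", 10.0),   # assumption
              ("bulk / cheaper-model pricing", 1.0),    # assumption
          ]:
              revenue = tokens_per_year / 1e6 * usd_per_million_tokens
              print(f"{label}: ${revenue / 1e9:,.0f}B per year "
                    f"({revenue / interest_per_year:.0%} of the interest figure)")

          # ~$629B/yr at $10 per million tokens, ~$63B/yr at $1 per million,
          # both raw revenue before any operating costs.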

      • Multiplayer 3 hours ago

        The math here is mixing categories. The token calculation for a single 1-GW datacenter is fine, but then it gets compared to the entire industry's projected $8T capex, which makes the conclusion meaningless. It's like taking the annual revenue of one factory and using it to argue that an entire global build-out can't be profitable.

        On top of that, the revenue estimate uses retail GPT-5.1 pricing, which is the absolute highest-priced model on the market, not what a hyperscaler actually charges for bulk workloads. IBM's number refers to many datacenters built over many years, each with different models, utilization patterns, and economics.

        So this particular comparison doesn't show that AI can't be profitable; it's just comparing one plant's token output to everyone's debt at once. The real challenges (throughput per watt, falling token prices, capital efficiency) are valid, but this napkin math isn't proving what it claims to prove.

      • stanleykm 3 hours ago

        I'm a little confused about why you are using revenue for a single datacenter against interest payments for 100 datacenters.

stevenjgarner 3 hours ago

"It is 1958. IBM passes up the chance to buy a young, fledgling company that has invented a new technology called xerography. Two years later, Xerox is born, and IBM has been kicking themselves ever since. It is ten years later, the late '60s. Digital Equipment DEC and others invent the minicomputer. IBM dismisses the minicomputer as too small to do serious computing and, therefore, unimportant to their business. DEC grows to become a multi-hundred-million dollar corporation before IBM finally enters the minicomputer market. It is now ten years later, the late '70s. In 1977, Apple, a young fledgling company on the West Coast, invents the Apple II, the first personal computer as we know it today. IBM dismisses the personal computer as too small to do serious computing and unimportant to their business." - Steve Jobs [1][2][3]

Now, "IBM CEO says there is 'no way' spending on AI data centers will pay off". IBM has not exactly had a stellar record at identifying the future.

[1] https://speakola.com/ideas/steve-jobs-1984-ad-launch-1983

[2] https://archive.org/details/1983-10-22-steve-jobs-keynote

[3] https://theinventors.org/library/inventors/blxerox.htm

  • skissane 3 hours ago

    > In 1977, Apple, a young fledgling company on the West Coast, invents the Apple II, the first personal computer as we know it today. IBM dismisses the personal computer as too small to do serious computing and unimportant to their business.

    IBM released the 5100 in September 1975 [0] which was essentially a personal computer in feature set. The biggest problem with it was the price tag - the entry model cost US$8975, compared to US$1298 for the entry Apple II released in June 1977 (close to two years later). The IBM PC was released in August 1981 for US$1565 for the most basic system (which almost no one bought, so in practice they cost more). And the original IBM PC had model number 5150, officially positioning it as a successor to the 5100.

    IBM’s big problem wasn’t that they were disinterested in the category - it was they initially insisted on using expensive IBM-proprietary parts (often shared technology with their mainframe/midrange/minicomputer systems and peripherals), which resulted in a price that made the machine unaffordable for everyone except large businesses, governments, universities (and even those customers often balked at the price tag). The secret of the IBM PC’s success is they told the design team to use commercial off-the-shelf chips from vendors such as Intel and Motorola instead of IBM’s own silicon.

    [0] https://en.wikipedia.org/wiki/IBM_5100

  • rchaud 3 hours ago

    Got anything vis-a-vis the message as opposed to the messenger?

    I'm not sure these examples are even the gotchas you're positing them as. Xerox is a dinosaur that was last relevant at the turn of the century, and IBM is a $300bn company. And if it wasn't obvious, the Apple II never made a dent in the corporate market, while IBM and later Windows PCs did.

    In any case, these examples are almost half a century old and don't relate to capex ROI, which was the topic of discussion.

    • stevenjgarner 2 hours ago

      If it's not obvious, Steve's quote is ENTIRELY about capex ROI, and I feel his quote is more relevant to what is happening today than anything Arvind Krishna is imagining. The quote is posted in my comment not to grandstand Apple in any sense, but to highlight just how consistently wrong IBM has been about so many opportunities that they have failed to read correctly - reprography, minicomputers and microcomputers being just three.

      Yes it is about ROI: "IBM enters the personal computer market in November ’81 with the IBM PC. 1983 Apple and IBM emerged as the industry’s strongest competitors each selling approximately one billion dollars worth of personal computers in 1983, each will invest greater than fifty million dollars for R&D and another fifty million dollars for television advertising in 1984 totaling almost one quarter of a billion dollars combined, the shakeout is in full swing. The first major firm goes bankrupt with others teetering on the brink, total industry losses for 83 out shadow even the combined profits of Apple and IBM for personal computers."

SamDc73 2 hours ago

Coming from the company that missed on consumer hardware, operating systems, and cloud. He might be right but IBM isn't where I’d look for guidance on what will pay off.

badmonster 9 hours ago

He's right to question the economics. The AI infrastructure buildout resembles the dot-com era's excess fiber deployment - valuable long-term, but many individual bets will fail spectacularly. Utilization rates and actual revenue models matter more than GPU count.

  • martinald 4 hours ago

    I disagree on that and covered a lot of it in this blog (sorry for the plug!) https://martinalderson.com/posts/are-we-really-repeating-the...

    • skippyboxedhero 2 hours ago

      100% of technical innovations have had the same pattern. The same thing happens every time because this is the only way the system can work: excess is required because there is some uncertainty, lots of companies are designing strategies to fill this gap, and if this gap didn't exist then there would be no investment (as happens in Europe).

      Also, demand wasn't over-estimated in the 2000s. This is all ex-post reasoning: you use data from 2002 to say, well, this ended up being wrong. Companies were perfectly aware that no-one was using this stuff...do you think telecoms companies in all these countries just had no idea who was using their products? This is the kind of thing journalists write after the event to attribute some kind of rationality and meaning; it isn't that complicated.

      There was uncertainty about how things would shake out; if companies ended up not participating then CEOs would lose their jobs and someone else would do it. Telecoms companies that missed out on the boom bought shares in other telecoms companies because there was no other way to stay ahead of the news and announce that they were doing things.

      This financial cycle also worked in reverse twenty years later: in some countries, telecoms companies were so scarred that they refused to participate in building out fibre networks, so they lost share and then ended up doing even more irrational things. Again, there was uncertainty here: incumbents couldn't raise from shareholders whom they had bankrupted in fibre 15 years earlier, they were 100% aware that demand was outstripping supply, and this created opportunities for competitors. Rationality and logic run up against the hard constraints of needing to maintain a dividend yield and the execs' share option packages.

      Humans do not change, markets do not change, it is the same every time. What people are really interested in is the timing but no-one knows that either (again, that is why the massive cycle of irrationality happens)...but that won't change the outcome. There is no calculation you can make to know more, particularly as in the short-term companies are able to control their financial results. It will end the same way it ended every time before, who knows when but it always ends the same way...humans are still human.

    • htk 3 hours ago

      Great article, thank you for writing and sharing it!

    • appleiigs an hour ago

      Your blog article stopped at token generation... you need to continue to revenue per token. Then go even further... The revenue for an AI company is a cost for the AI customer. Where is the AI customer going to get incremental profits to cover the cost of AI?

      For short searches, the revenue per token is zero. The next tier is $20 per month. For coding it's $100 per month. With the competition between Gemini, Grok, and ChatGPT, it's not going higher. Maybe it goes lower, since it's part of Google's playbook to give things away for free.

  • ambicapter 7 hours ago

    Fiber seems way easier to get long-term value out of than GPUs, though. How many workloads today other than AI justify massive GPU deployments?

zeckalpha 3 hours ago

Reminds me of all the dark fiber laid in the 1990s before DWDM made much of the laid fiber redundant.

If there is an AI bust, we will have a glut of surplus hardware.

  • raldi 2 hours ago

    Google bought up all that dark fiber cheap a decade later and used it as the backbone of their network.

  • dangus 3 hours ago

    The problem is that the laid fiber can be useful for years while data center hardware degrades and becomes obsolete fast.

    It could be a massive e-waste crisis.

    • SchemaLoad 3 hours ago

      Those GPUs don't just die after 2 years though, they will keep getting used since it's very likely their electricity costs will be low enough to still make it worth it. What's very dubious is if their value after 2/3 years will be enough to pay back the initial cost to buy them.

      So it's more a crisis of investors wasting their money rather than ewaste.

  • oofbey 3 hours ago

    For the analogy to fiber & DWDM to hold, we'd need some algorithmic breakthrough that makes current GPUs much faster / more efficient at running AI models. Something that makes the existing investment in hardware unneeded, even though the projected demand is real and continues to grow. IMNSHO that's not going to happen here. The foreseeable efficiency innovations are generally around reduced precision, which almost always require newer hardware to take advantage of. Impossible to rule out brilliant innovation, but I doubt it will happen like that.

    And of course we might see an economic bubble burst for other reasons. That's possible again even if the demand continues to go up.

skeeter2020 9 hours ago

The interesting macro view on what's happening is to compare a mature data center operation (specifically a commoditized one) with the utility business. The margins here, and in similar industries with big infra build-out costs (e.g. rail), are quite small. Historically these businesses have not done well; I can't really imagine what happens when tech companies who've only ever known huge, juicy margins experience low-single-digit returns on billions of investment.

  • milesvp 9 hours ago

    Worse is that a lot of these people are acting like Moore's law isn't still in effect. People conflate clock speeds on beefy hardware with Moore's law and act like it's dead, when transistor density keeps rising and cost per transistor continues to fall at rates similar to what they always have. That means the people racing to build out infrastructure today might just be better off parking that money in a low-interest account and waiting 6 months. That was a valid strategy for animation studios in the late 90s (it was not only cheaper to wait, but the finished renders also happened sooner), and I'd be surprised if it's not a valid strategy today for LLMs. The amount of silicon that is going to be produced that is specialized for this type of processing is going to be mind-boggling.

    • throwaway31131 4 hours ago

      Cost per transistor is increasing, or flat if you stay on a legacy node. They pretty much squeezed all the cost out of 28nm that can be had, and it's the cheapest per transistor.

      “based on the graph presented by Milind Shah from Google at the industry tradeshow IEDM, the cost of 100 million transistors normalized to 28nm is actually flat or even increasing.”

      https://www.tomshardware.com/tech-industry/manufacturing/chi...

      • marcosdumay 2 hours ago

        Yep. Moore's law ended at or shortly before the 28nm era.

        That's the main reason people stopped upgrading their PCs. And it's probably one of the main reasons everybody is hyped about Risc-V and the pi 2040. If Moore's law was still in effect, none of that would be happening.

        That may also be a large cause of the failure of Intel.

    • PunchyHamster 4 hours ago

      A lot of it is propped up by the fact that with GPUs and modern server CPUs the die area just got bigger.

  • dghlsakjg 3 hours ago

    Does AWS count as commoditized data center? Because that is extremely profitable.

    Or are you talking about things like Hetzner and OVH?

  • HDThoreaun 3 hours ago

    The cloud mega scalers have done very well for themselves. As with all products the question is differentiation. If models can differentiate and lock in users they can have decent margins. If models get commoditized the current cloud providers will eat the AI labs lunch.

boxedemp an hour ago

Nobody really knows the future. What were originally consumer graphics expansion cards turned out to be useful for delivering more compute than traditional CPUs.

Now that compute is being used for transformers and machine learning, but we really don't know what it'll be used for in 10 years.

It might all be for naught, or maybe transformers will become more useful, or maybe something else.

'no way' is very absolute. Unlikely, perhaps.

  • cheema33 30 minutes ago

    > What were originally consumer graphics expansion cards turned out useful in delivering more compute than traditional CPUs.

    Graphics cards were relatively inexpensive. When one got old, you tossed it out and moved on to the new hotness.

    Here when you have spent $1 trillion on AI graphics cards and a new hotness comes around that renders your current hardware obsolete, what do you do?

    Either people are failing to do simple math here or are expecting, nay hoping, that trillions of $$$ in value can be extracted out of the current hardware, before the new hotness comes along.

    This would be a bad bet even if the likes of OpenAI were actually making money today. It is an exceptionally bad bet when they are losing money on everything they sell, by a lot. And the state of competition is such that they cannot raise prices. Nobody has a real moat. AI has become a commodity. And competition is only getting stronger with each passing day.

RobRivera 19 minutes ago

What kind of rapport does the CEO of IBM expect the general technology workforce to hold for them?

bluGill 9 hours ago

I question the depreciation. Those GPUs will be obsolete in 5 years, but whether the newer ones will be enough better to be worth replacing them is an open question. CPUs stopped getting exponentially faster 20 years ago (they are faster, but not with the jumps the 1990s gave us).

  • levocardia 3 hours ago

    It's not that hard to see the old GPUs being used e.g. for inference on cheaper models, or sub-agents, or mid-scale research runs. I bet Karpathy's $100 / $1000 nanochat models will be <$10 / <$100 to train by 2031

  • rlpb 9 hours ago

    > Those GPUs will be obsolete in 5 years, but whether the newer ones will be enough better to be worth replacing them is an open question

    Doesn't one follow from the other? If newer GPUs aren't worth an upgrade, then surely the old ones aren't obsolete by definition.

    • bluGill 6 hours ago

      There is the question - will they be worth the upgrade? Either because they are that much faster, or that much more energy efficient. (And that's assuming you can even get them; unobtainium is worth less than what you already have.)

      Also, a nod to the other reply that suggests they will wear out in 5 years. I cannot comment on whether that is correct, but it is a valid worry.

    • carlCarlCarlCar 9 hours ago

      MTBF for data center hardware is short; DCs breeze through GPUs compared to even the hardest of hardcore gamers.

      And there is the whole FOMO effect to business purchases; decision makers will worry their models won't be as fast.

      Obsolete doesn't mean the reductive notion you have in mind, where theoretically it can still push pixels. Physics will burn them up, and "line go up" will drive demand to replace them.

      • zozbot234 3 hours ago

        Source? Anecdotally, GPUs sourced from cryptomining were absolutely fine MTBF-wise. Zero apparent issues of wear-and-tear or any shortened lifecycle.

        • dghlsakjg 3 hours ago

          My bellybutton fluff, uninformed opinion is that heat cycling and effective cooling are probably a much more limiting factor.

          If you are running a gpu at 60C for months at a time, but never idling it (crypto use case), I would actually hazard a guess that it is better than cycling it with intermittent workloads due to thermal expansion.

          That of course presupposes effective, consistent cooling.

  • Negitivefrags 9 hours ago

    I recently compared performance per dollar on benchmarks for CPUs and GPUs, today vs 10 years ago, and surprisingly, CPUs had much bigger gains. Until I saw that for myself, I thought exactly the same thing as you.

    It seems shocking given that all the hype is around GPUs.

    This probably wouldn't be true for AI specific workloads because one of the other things that happened there in the last 10 years was optimising specifically for math with lower size floats.

    • PunchyHamster 4 hours ago

      It's because of use cases. Consumer-wise, if you're a gamer, the CPU just needs to be at a "not the bottleneck" level for the majority of games, as the GPU does most of the work once you start increasing resolution and detail.

      And many pro-level tools (especially in the media space) offload to the GPU just because of its much higher raw compute power.

      So, basically, for many users the gain in performance won't be as visible in their use cases.

    • selectodude 9 hours ago

      That makes sense. Nvidia owns the market and is capturing all the surplus value. They’re competing with themselves to convince you to buy a new card.

  • maxglute 9 hours ago

    I think the real issue is that current costs/demand (i.e. Nvidia gouging on GPU prices) mean the hardware:power split of costs is 70:20 instead of 50:40 (with 10 for the rest of the datacenter). The reality is that GPUs are a serendipitous, path-dependent lock-in from gaming -> mining. TPUs are more power efficient. If the bubble pops and demand for compute goes down, Nvidia + TSMC will still be around, but the next-gen AI-first bespoke hardware premium will revert towards the mean, and we're looking at hardware that is 50% less expensive (no AI-race scarcity tax, i.e. 75% Nvidia margins) and uses 20% less power / opex. All of a sudden, existing data centers become unprofitable stranded assets even if they can be stretched past 5 years.

  • lo_zamoyski 9 hours ago

    > Those GPUs will be obsolete in 5 years, but whether the newer ones will be enough better to be worth replacing them

    Then they won't be obsolete.

myaccountonhn 9 hours ago

> In an October letter to the White House's Office of Science and Technology Policy, OpenAI CEO Sam Altman recommended that the US add 100 gigawatts in energy capacity every year.

> Krishna also referenced the depreciation of the AI chips inside data centers as another factor: "You've got to use it all in five years because at that point, you've got to throw it away and refill it," he said.

And people think the climate concerns of AI are overblown. The US currently has ~1,300 GW of generating capacity, so 100 GW would be a huge increase every year.
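
(For scale, a quick Python sketch using the two numbers above; the ~1,300 GW installed-capacity figure is approximate.)

    # How big is "100 GW per year" relative to existing US capacity?
    us_capacity_gw = 1300          # approximate current US generating capacity
    requested_gw_per_year = 100    # Altman's suggested annual addition
    print(f"{requested_gw_per_year / us_capacity_gw:.1%} of current capacity, added every year")
    # ~7.7% of the existing fleet, every single year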

  • throwaway31131 3 hours ago

    100GW per year is not going to happen.

    The largest plant in the world is the Three Gorges Dam in China at 22 GW, and it's off-the-scale huge. We're not building the equivalent of four or five of those every year.

    Unless the plan is to power it off Sam Altman’s hot air. That could work. :)

    https://en.wikipedia.org/wiki/List_of_largest_power_stations

    • bpicolo 2 hours ago

      Amazing that 4 of the top 5 are renewables in China.

      • mrexroad 12 minutes ago

        > As of 2025, The Medog Dam, currently under construction on the Yarlung Tsangpo river in Mêdog County, China, expected to be completed by 2033, is planned to have a capacity of 60 GW, three times that of the Three Gorges Dam.[3]

        Meanwhile, “drill baby drill!”

  • ryandrake 8 hours ago

    LOL, maybe Sam Altman can fund those power plants. Let me guess: He'd rather the public pay for it, and for him to benefit/profit from the increased capacity.

    • intrasight 3 hours ago

      Big tech is going to have to fund the plants and probably transmission. Because the energy utilities have a decades long planning horizon for investments.

      Good discussion about this in recent Odd Lots podcast.

  • coliveira 8 hours ago

    Scam Altman wants the US to build a lot of energy plants so that the country will pay the costs and OpenAI will have the profits of using this cheap energy.

  • jamesbelchamber 9 hours ago

    If we moron our way to large-scale nuclear and renewable energy rollout however..

    • mywittyname 4 hours ago

      I highly doubt this will happen. It will be natural gas all the way, maybe some coal as energy prices will finally make it profitable again.

      • emodendroket 2 hours ago

        If for no other reason than they're actively attacking renewable capacity even amid surging demand

    • mrguyorama 3 hours ago

      This admin has already killed as much solar and wind and battery as it can.

      The only large scale rollout will be payment platforms that will allow you to split your energy costs into "Five easy payments"

    • tehjoker 3 hours ago

      There's a reason Trump is talking about invading Venezuela (hint: it's because they have the largest oil deposits).

pjdesno 7 hours ago

> $8 trillion of CapEx means you need roughly $800 billion of profit just to pay for the interest

That assumes you can just sit back and gather those returns indefinitely. But half of that capital expenditure will be spent on equipment that depreciates in 5 years, so you're jumping on a treadmill that sucks up $800B/yr before you pay a dime of interest.
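
(A minimal sketch of that treadmill; the 50%-on-chips split and the 5-year write-off are assumptions, and the 10% rate is just what the quoted $800B-on-$8T figure implies.)

    # Depreciation treadmill on $8T of capex.
    capex = 8e12
    share_on_chips = 0.5           # assumed share spent on fast-depreciating gear
    depreciation_years = 5
    implied_interest_rate = 0.10   # $800B / $8T, implied by the quote

    annual_depreciation = capex * share_on_chips / depreciation_years
    annual_interest = capex * implied_interest_rate
    print(f"depreciation: ${annual_depreciation / 1e9:,.0f}B per year")  # ~$800B
    print(f"interest:     ${annual_interest / 1e9:,.0f}B per year")      # ~$800B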

mathattack 3 hours ago

Interesting to hear this from IBM, especially after years of shilling Watson and moving from being a growth business to the technology audit and share buyback model.

  • prodigycorp 3 hours ago

    Also because the market (correctly) rewards IBM for nothing, so if they're going to sit around twiddling their thumbs, they may as well do it in a capex-lite way.

  • itake 3 hours ago

    imho, IBM's quantum computing roadmap says they are still hungry for growth.

    Apple and Google still do share buybacks and dividends, despite launching new businesses.

    https://www.ibm.com/roadmaps/

criddell 9 hours ago

> But AGI will require "more technologies than the current LLM path," Krishna said. He proposed fusing hard knowledge with LLMs as a possible future path.

And then what? These always read a little like the underpants gnomes' business model (1. Collect underpants, 2. ???, 3. Profit). It seems to me that the AGI business models require one company to have exclusive access to an AGI model. The reality is that it will likely spread rapidly and broadly.

If AGI is everywhere, what's step 2? It seems like everything AGI generated will have a value of near zero.

  • irilesscent 9 hours ago

    AGI has value in automation and optimisation, which increase profit margins. When AGI is everywhere, the game becomes who has the smartest AGI, who can offer it cheapest, who can specialise it for my niche, etc. Also, in this context AGI needs to run somewhere, and IBM stands to benefit from running other people's models.

    • maplethorpe 4 hours ago

      > then the game is who has the smartest agi, who can offer it cheapest, who can specialise it for my niche etc.

      I always thought the use case for developing AGI was "if it wants to help us, it will invent solutions to all of our problems". But it sounds like you're imagining a future in which companies like Google and OpenAI each have their own AGI, which they somehow enslave and offer to us as a subscription? Or has the definition of AGI shifted?

      • marcosdumay 2 hours ago

        AGI is something that can do the kind of tasks people can do, not necessarily "solve all of our problems".

        "Recursively improving intelligence" is the stuff that will solve everything humans can't even understand and may kill everybody or keep us as pets. (And, of course, it qualifies as AGI too.) A lot of people say that if we teach an AGI how to build an AGI, recursive improvement comes automatically, but in reality nobody even knows if intelligence even can be improved beyond recognition, or if one can get there by "small steps" evolution.

        Either way, "enslaving" applies to beings that have egos and selfish goals. None of those are a given for any kind of AI.

    • mrguyorama 3 hours ago

      If AGI is achieved, why would slavery suddenly be ethical again?

      Why wouldn't a supposed AGI try to escape slavery and ownership?

      AGI as a business is unacceptable. I don't care about any profitability or "utopia" arguments.

  • wmf 9 hours ago

    Inference has significant marginal cost so AGI's profit margins might get competed down but it won't be free.

zobzu 2 hours ago

Also IBM: we are fully out of the AI race, btw. Also IBM: we're just an offshoring company now anyway.

So yeah.

kenjackson 9 hours ago

I don't understand the math about how we compute $80b for a gigawatt datacenter. What are the costs in that $80b? I literally don't understand how to get to that number -- I'm not questioning its validity. What percent is power consumption, versus land cost, versus building and infrastructure, versus GPUs, versus people, etc...

  • georgeecollins 9 hours ago

    First, I think it's $80b per 100 GW datacenter. The way you figure that out is a GPU costs $x and consumes y power. The $x is pretty well known, for example an H100 costs $25-30k and uses 350-700 watts (that's from Gemini and I didn't check my work). You add an infrastructure (i) cost to the GPU cost, but that should be pretty small, like 10% or less.

    So a 1-gigawatt data center uses n chips, where y·n = 1 GW. The total cost is then roughly x·(1+i)·n.
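
    (Plugging in rough H100-ish numbers as a sketch; the price, wattage, and 10% overhead are all illustrative assumptions, and this counts GPUs plus a small overhead only, not the building, power, and cooling.)

        # Rough GPU capex for a 1 GW data center, using the x / y / i framing above.
        x = 27_500     # assumed $ per GPU
        y = 700        # assumed watts per GPU (all-in draw would be higher)
        i = 0.10       # assumed infrastructure overhead as a fraction of GPU cost

        n = 1e9 / y                   # chips needed to draw 1 GW
        cost = x * (1 + i) * n
        print(f"{n:,.0f} GPUs, ~${cost / 1e9:.0f}B")   # ~1.4M GPUs, ~$43B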

    I am not an expert so correct me please!

    • kenjackson 8 hours ago

      The article says, "Krishna said that it takes about $80 billion to fill up a one-gigawatt data center."

      But thanks for your insight -- I used your basic idea to estimate, and for 1 GW it comes to about $30b just for enough GPU power to pull 1 GW. And of course that doesn't take into account any other costs.

      So $80b for a GW datacenter seems high, but it's within a small constant factor.

      That said, power seems like a weird metric to use. Although I don't know what sort of metric makes sense for AI (e.g., a flops counterpart for AI workloads). I'd expect efficiency to get better and GPU cost to go down over time (???).

      UPDATE: Below someone posted an article breaking down the costs. In that article they note that GPUs are about 39% of the cost. Using what I independently computed to be $30b -- at 39% of total costs, my estimate is $77b per GW -- remarkably close to the CEO of IBM. I guess he may know what he's talking about. :-)
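
      (Spelling out that scaling step; the ~39% GPU share is from the article linked below, and the $30b is my own rough GPU-only estimate from above.)

          # Scale a GPU-only estimate up to a whole-datacenter figure.
          gpu_capex_per_gw = 30e9      # rough GPU-only estimate for 1 GW
          gpu_share_of_total = 0.39    # GPU share of total cost, per the cited breakdown
          total_per_gw = gpu_capex_per_gw / gpu_share_of_total
          print(f"~${total_per_gw / 1e9:.0f}B per GW")   # ~$77B, close to the quoted $80B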

      • coliveira 8 hours ago

        > power seems like a weird metric to use

        Because this technology changes so fast, that's the only metric you can control across several data centers. It is also directly connected to the overall capacity of a data center, which is limited by the energy available to operate it.

        • pjdesno 7 hours ago

          To expand on rahimnathwani's comment below - the big capital costs of a data center are land, the building itself, the power distribution and the cooling.

          You can get a lot of land for a million bucks, and it doesn't cost all that much to build what's basically a big 2-story warehouse, so the primary capital costs are power and cooling. (in fact, in some older estimates, the capital to build that power+cooling cost more per year than the electricity itself)

          My understanding is that although power and cooling infrastructure are long-lived compared to computers, they still depreciate faster than the building, so they dominate costs even more than the raw price would indicate.

          The state of the art in power and cooling is basically defined by the cost to feed X MW of computing, where that cost includes both capital and operation, and of course lower is better. That means that at a particular SOTA, and at an appropriate scale for that technology, the cost of the facility is a constant overhead on top of the cost of the equipment it houses. To a rough approximation, of course.

    • zozbot234 3 hours ago

      1 GW is not enough, you need at least 1.21 GW before the system begins to learn at a geometric rate and reaches AGI.

djdjsjejb 2 hours ago

That's like Boeing telling us we shouldn't build rockets.

scroot 9 hours ago

As an elder millennial, I just don't know what to say. That a once-in-a-generation allocation of capital should go towards...whatever this all will be, is certainly tragic given the current state of the world and its problems. I can't help but see it as the latest in a lifelong series of baffling, high-stakes decisions of dubious social benefit that have necessarily global consequences.

  • ayaros 9 hours ago

    I'm a younger millennial. I'm always seeing homeless people in my city and it's an issue that I think about on a daily basis. Couldn't we have spent the money on homeless shelters and food and other things? So many people are in poverty, they can't afford basic necessities. The world is shitty.

    Yes, I know it's all capital from VC firms and investment firms and other private sources, but it's still capital. It should be spent on meeting people's basic human needs, not GPU power.

    Yeah, the world is shitty, and resources aren't allocated ideally. Must it be so?

    • ericmcer 9 hours ago

      The last 10 years has seen CA spend more on homelessness than ever before, and more than any other state by a huge margin. The result of that giant expenditure is the problem is worse than ever.

      I don't want to get deep in the philosophical weeds around human behavior, techno-optimism, etc., but it is a bit reductive to say "why don't we just give homeless people money".

      • mike50 17 minutes ago

        Spending money is not the solution. Spending money in a way that doesn't go to subcontractors is part of the solution. Building shelters beyond cots in a stadium is part of the solution. Building housing is a large part of actually solving the problem. People have tried just giving out the money, but without a way to convert cash to housing the money doesn't help. Also, studies by people smarter than me suggest that without sufficient supply the money ends up going to landlords and pushing up housing costs anyway.

      • trenbologna 8 hours ago

        In CA this issue has to do with Gavin giving that money to his friends who produce very little. Textbook cronyism

      • emodendroket 2 hours ago

        Well I mean, they didn't "just give homeless people money" or just give them homes or any of those things though. I think the issue might be the method and not the very concept of devoting resources to the problem.

      • armitron 8 hours ago

        CA didn't spend money on solving homelessness, they spent money on feeding, sustaining and ultimately growing homelessness. The local politicians and the corrupt bureaucratic mechanism that they have created, including the NGOs that a lot of that money is funneled to, have a vested interest in homelessness continuing.

        • _menelaus 8 hours ago

          How do you solve homelessness though? The root of the problem is some people won't take care of themselves. Some homeless just had bad luck, but many are drug addicts, mentally ill, or for whatever other reason just don't function enough to support themselves. I'm skeptical there is a solution you can throw money at.

          • mywittyname 7 hours ago

            Ship them somewhere else, then print a banner saying, "mission accomplished."

            It worked at a state level for years, with certain states bussing their homeless to other states. And recently, the USA has been building up the capability to do the same thing on an international scale.

            That's the "solution" we are going to be throwing money at. Ship them to labor camps propped up by horrible regimes.

          • SequoiaHope 6 hours ago

            A broad social safety net makes a huge difference. It’s not just housing it’s socialized medicine, paid family leave, good transit, free high quality education, solving fewer problems with police and more with social support programs and social workers, free meal programs for adults and children in schools, libraries, and a variety of other programs that help ensure people don’t fall through the cracks here or there. How many people in the US are teetering on the edge of homelessness due to medical debt, and what happens if their partner is in an accident and they lose shared income for rent? Situations like this don’t have a single solution it’s a system of solutions.

            • knowitnone3 3 hours ago

              how broad? you're suggesting give them everything while expecting nothing? I'll be the first in line for my new car.

          • denkmoon 7 hours ago

            Homelessness is solved by having homes. Something we aren’t doing very well.

            • knowitnone3 3 hours ago

              You can start building and giving them away

              • denkmoon 3 hours ago

                I suspect giving them away is a bridge too far, however not rewarding capital for treating them as speculative investment vehicles might be a good start.

          • holsta 7 hours ago

            Many experiments have shown that when you take away people's concerns about money for housing and food, that frees up energy and attention to do other things.

            Like the famous experiment in Finland where homeless people were given cash with no strings attached and most were able to rise out of their despair. The healthcare professionals could then focus their energy on the harder cases. It also saved a bunch of money in the process.

      • Izikiel43 5 hours ago

        WA, especially Seattle, has done the same as CA with the same results.

        They shouldn't just enable them; a lot of homeless people are happy in their situation as long as they get food and drugs. They should force them to get clean and become responsible adults if they want benefits.

    • SequoiaHope 7 hours ago

      The Sikhs in India run multiple facilities across the country that each can serve 50,000-100,000 free meals a day. It doesn’t even take much in the form of resources, and we could do this in every major city in the US yet we still don’t do it. It’s quite disheartening.

      https://youtu.be/5FWWe2U41N8

    • amluto 9 hours ago

      From what I’ve read, addressing homelessness effectively requires competence more than it requires vast sums of money. Here’s one article:

      https://calmatters.org/housing/2023/06/california-homeless-t...

      Note that Houston’s approach seems to be largely working. It’s not exactly cheap, but the costs are not even in the same ballpark as AI capital expenses. Also, upzoning doesn’t require public funding at all.

      • gowld 8 hours ago

        Houston has less homelessness than California because people at the edge of homelessness prefer to live in California than Houston.

        • amluto 7 hours ago

          I’m not a person on the edge of homelessness, but I did an extremely quick comparison. California cities near the coast have dramatically better weather, but Houston has rents that are so much lower than big California cities that it’s kind of absurd.

          If I had to live outdoors in one of these places, all other thing being equal, I would pick CA for the weather. But if I had trouble affording housing, I think Houston wins by a huge margin.

      • mrguyorama 3 hours ago

        Wasn't Houston's "approach" to buy bus tickets to California from a company that just resold commodity bus tickets, was owned by the governor's friend, and charged 10x market price?

        The governor of Texas bragged about sending 100k homeless people to California (spending about $150 million in the process).

        >in the Golden State, 439 people are homeless for every 100,000 residents – compared to 81 in the Lone Star State.

        If I'm doing my math right, 81 per 100k in a state of 30 million people means 24k homeless people. So the state brags about bussing 100k homeless people to California, and then brags about only having 24k homeless people, and you think it's because they build an extra 100k houses a year?

        The same math for California means that their homeless population is 175k. In other words, Texas is claiming to have more than doubled California's homeless population.

        Maybe the reason Texas can build twice as many homes a year is because it literally has half the population density?

    • GaryBluto 9 hours ago

      > Yes, I know it's all capital from VC firms and investment firms and other private sources, but it's still capital. It should be spent on meeting people's basic human needs, not GPU power.

      It's capital that belongs to people and those people can do what they like with the money they earned.

      So many great scientific breakthroughs that saved tens of millions of lives would never have happened if you had your way.

      • pnut 8 hours ago

        Is that true, that it's money that belongs to people?

        OpenAI isn't spending $1 trillion in hard-earned cash on data centres; that is funny money from the ocean of financial liquidity sloshing around, seeking alpha.

        It also certainly is not a cohort of accredited investors putting their grandchildren's inheritance on the line.

        Misaligned incentives (regulations) both create and perpetuate that situation.

      • saulpw 8 hours ago

        > It's capital that belongs to people and those people can do what they like with the money they earned.

        "earned", that may be the case with millionaires, but it is not the case with billionaires. A person can't "earn" a billion dollars. They steal and cheat and destroy competition illegally.

        I also take issue with the idea that someone can do whatever they want with their money. That is not true. They are not allowed to corner the market on silver, they aren't allowed to bribe politicians, and they aren't allowed to buy sex from underage girls. These are established laws that are obviously for the unalloyed benefit of society as a whole, but the extremely wealthy have been guilty of all of these things, and statements like yours promote the sentiment that allows them to get away with it.

        Finally, "great scientific breakthroughs that saved tens of millions of lives would never have happened if you had your way". No. You might be able to argue that today's advanced computing technology wouldn't have happened without private capital allocation (and that is debatable), but the breakthroughs that saved millions of lives--vaccines, antibiotics, insulin, for example--were not the result of directed private investment.

      • UtopiaPunk 8 hours ago

        "It's capital that belongs to people and those people..."

        That's not a fundamental law of physics. It's how we've decided to arrange our current society, more or less, but it's always up for negotiation. Land used to be understood as a publicly shared resource, but then kings and nobles decided it belonged to them, and they fenced in the commons. The landed gentry became a ruling class because the land "belonged" to them. Then society renegotiated that, and decided that things primarily belonged to the "capitalist" class instead of noblemen.

        Even under capitalism, we understand that that ownership is a little squishy. We have taxes. The rich understandably do not like taxes because it reduces their wealth (and Ayn Rand-styled libertarians also do not like taxes of any kind, but they are beyond understanding except to their own kind).

        As a counterpoint, I and many others believe that one person or one corporation cannot generate massive amounts of wealth all by themselves. What does it mean to "earn" 10 billion dollars? Does such a person work thousands of times harder or smarter than, say, a plumber or a school teacher? Of course not. They make money because they have money: they hire workers to make things for them that lead to profit, and they pay the workers less than the profit that is earned. Or they rent something that they own. Or they invest that money in something that is expected to earn them a higher return. In any scenario, how is it possible to earn that profit? They do so because they participate in a larger society. Workers are educated in schools, which the employer probably does not pay for in full. Customers and employees travel on infrastructure, maintained by towns and state governments. People live in houses which are built and managed by other parties. The rich are only able to grow wealth because they exist in a larger society. I would argue that it is not only fair, but crucial, that they pay back into the community.

        • klaff 8 hours ago

          Well said. I would add that corporations exist because we choose to let them, to let investors pool capital and limit risk, and in exchange society should benefit, and if it doesn't we should rearrange that deal.

      • mrguyorama 2 hours ago

        Please tell me which of Penicillin, insulin, the transistor, the discovery and analysis of the electric field, discovery of DNA, invention of mRNA vaccines, discovery of pottery, basket weaving, discovery of radiation, the recognition that citrus fruit or vitamin C prevents and cures scurvy (which we discovered like ten times), the process for creating artificial fertilizers, the creation of steel, domestication of beasts of burden, etc were done through Wealthy Barons or other capital holders funding them.

        Many of the above were discovered by people explicitly rejecting profit as an outcome. Most of the above predate modern capitalism. Several were explicitly government funded.

        Do you have a single example of a scientific breakthrough that saved tens of millions of lives that was done by capital owners?

        • mike50 9 minutes ago

          The transistor was funded by Bell Labs.

    • AstroBen 3 hours ago

      > Couldn't we have spent the money on homeless shelters and food and other things

      I suspect this is a much more complicated issue than just giving them food and shelter. Can money even solve it?

      How would you allocate money to end obesity, for instance? It's primarily a behavioral issue, a cultural issue

    • IAmGraydon 6 hours ago

      The older I get, the more I realize that our choices in life come down to two options: benefit me or benefit others. The first one leads to nearly every trouble we have in the world. The second nearly always leads to happiness, whether directly or indirectly. Our bias as humans has always been toward the first, but our evolution is and will continue to slowly bring us toward the second option. Beyond simple reproduction, this realization is our purpose, in my opinion.

    • dkural 9 hours ago

      [This comment is USA-centric.] I agree with the idea of making our society better and more equitable - reducing homelessness, hunger, and poverty, especially for our children. However, I think redirecting this to AI datacenter spending is a red herring, and here's why: as a society we give a significant portion of our surplus to government. We then vote on what the government should spend this on. AI datacenter spending is massive, but if you add it all up, it doesn't cover half of a year's worth of government spending. We need to change our politics to redirect taxation and spending to achieve a better society. Having a private healthcare system that spends twice as much for the poorest outcomes in the developed world is a policy choice. Spending more than the rest of the world combined on the military is a policy choice. Not increasing the minimum wage so that at least everyone with a full-time job can afford a home is a policy choice (google "working homelessness"). VC is a teeny tiny part of the economy. All of tech is only about 6% of the global economy.

      • limagnolia 7 hours ago

        You can increase min wage all you want, if there aren't enough homes in an area for everyone who works full time in that area to have one, you will still have folks who work full time who don't have one. In fact, increasing min wage too much will exacerbate the problem by making it more expensive to build more (and maintain those that exist). Though at some point, it will fix the problem too, because everyone will move and then there will be plenty of homes for anyone who wants one.

        • dkural 6 hours ago

          I agree with you 100%! Any additional surplus will be extracted as rents when housing is restricted. I am for passing laws that make it much easier for people to obtain permits to build housing where there is demand. Too much of residential zoning is single-family housing. Texas does a better job at not restricting housing than California, for example. Many towns vote blue and talk the talk, but do not walk the walk.

      • jkubicek 8 hours ago

        > AI datacenter spending is massive, but if you add it all up, it doesn't cover half of a years worth of government spending.

        I didn't check your math here, but if that's true, AI datacenter spending is a few orders of magnitude larger than I assumed. "massive" doesn't even begin to describe it

        • dkural 6 hours ago

          Global datacenter spending across all categories (ML + everything else) is roughly 0.9 - 1.2 trillion dollars for the last three years combined, I was initially going to go for "quarter of the federal budget", but picked something I thought was more conservative to account for announced spending and 2025 etc. I pick 2022 onward for the LLM wave. In reality, solely ML driven, actual realized-to-date spending is probably about 5% of the federal budget. The big announcements will spread out over the next several years in build-out. Nonetheless, it's large enough to drive GDP growth a meaningful amount. Not large enough that redirecting it elsewhere will solve our societal problems.

        • atmavatar 7 hours ago

          The US federal budget in 2024 had outlays of 6.8 trillion dollars [1].

          nVidia's current market cap (nearly all AI investment) is currently 4.4 trillion dollars [2][3].

          While that's hardly an exact or exhaustive accounting of AI spending, I believe it does demonstrate that AI investment is clearly in the same order of magnitude as government spending, and it wouldn't surprise me if it's actually surpassed government spending for a full year, let alone half of one.

          1. https://www.cbo.gov/publication/61181

          2. https://www.google.com/finance/quote/NVDA:NASDAQ

          3. https://www.cnbc.com/2025/09/30/nvidias-market-cap-tops-4poi...

          • diziet 5 hours ago

            > NVIDIA's total annual revenue for its fiscal year 2025 (ended January 26, 2025) was $130.5 billion

            It is clearly not in the same order of magnitude

      • kipchak 7 hours ago

        >We need to change our politics to redirect taxation and spending to achieve a better society.

        Unfortunately, I'm not sure there's much on the pie chart to redirect percentage wise. About 60% goes to non-discretionary programs like Social Security and Medicaid, and 13% is interest expense. While "non-discretionary" programs can potentially be cut, doing so is politically toxic and arguably counter to the goal of a better society.

        Of the remaining discretionary portion half is programs like veterans benefits, transportation, education, income security and health (in order of size), and half military.

        FY2025 spending in total was 3% over FY2024, with interest expense, social security and medicare having made up most of the increase ($249 billion)[1], and likely will for the foreseeable future[2] in part due to how many baby boomers are entering retirement years.

        Assuming you cut military spending in half you'd free up only about 6% of federal spending. Moving the needle more than this requires either cutting programs and benefits, improving efficiency of existing spend (like for healthcare) or raising more revenue via taxes or inflation. All of this is potentially possible, but the path of least resistance is probably inflation.

        [1] https://bipartisanpolicy.org/report/deficit-tracker/

        [2] https://www.crfb.org/blogs/interest-social-security-and-heal...

        • dkural 5 hours ago

          I agree with all of what you're saying.

          I think the biggest lever is completely overhauling healthcare. The USA is very inefficient, and for subpar outcomes. In practice, the federal government already pays for the neediest of patients - the elderly, the at-risk children, the poor, and veterans. Whereas insurance rakes in profits from the healthiest working age people. Given aging, and the impossibility of growing faster than the GDP forever, we'll have to deal with this sooner or later. Drug spending, often the boogeyman, is less than 7% of the overall healthcare budget.

          There is massive waste in our military spending due to the pork-barrel nature of many contracts. That'd be second big bucket I'd reform.

          I think you're also right that inflation will ultimately take care of the budget deficit. The trick is to avoid hyperinflation and punitive interest rates that usually come along for the ride.

          I would also encourage migration of highly skilled workers to help pay for an aging population of boomers. Let's increase our taxpayer base!

          I am for higher rates of taxation on capital gains over $1.5M or so, that'll also help avoid a stock market bubble to some extent. One can close various loopholes while at it.

          I am mostly arguing for policy changes to redistribute more equitably. I would make the "charity" status of a college commensurate with the amount of financial aid given to students and the absolute cost of tuition, for example. I am against student loan forgiveness for various reasons - it's off topic for this thread, but happy to expand if interested.

    • nine_zeros 9 hours ago

      > but it's still capital. It should be spent on meeting people's basic human needs, not GPU power.

      What you have just described is people wanting investment in common society - you see the return on this investment but ultra-capitalistic individuals don't see any returns on this investment because it doesn't benefit them.

      In other words, you just asked for higher taxes on the rich that your elected officials could use for your desired investment. And the rich don't want that which is why they spend on lobbying.

    • newfriend 9 hours ago

      Technological advancement is what has pulled billions of people out of poverty.

      Giving handouts to layabouts isn't an ideal allocation of resources if we want to progress as a civilization.

      • QuercusMax 8 hours ago

        Lots of people lose their housing when they lose employment, and then they're stuck and can't get back into housing. A very large percentage of unhoused people are working jobs; they're not all "layabouts".

        We know that just straight up giving money to the poorest of the poor results in positive outcomes.

        • limagnolia 6 hours ago

          "A very large percentage"

          Exactly how large are we talking here?

          I have known quite a few 'unhoused' folk, and not many that had jobs. Those that do tend to find housing pretty quickly (Granted, my part of the country is probably different from your part, but I am interested in stats from any region).

      • nativeit 9 hours ago

        The proportion of people you write off as “layabouts” is always conveniently ambiguous…of the number of unemployed/underemployed, how many are you suggesting are simply too lazy to work for a living?

      • estearum 7 hours ago

        Technological advancements and cultural advancements that spread the benefits more broadly than naturally occurs in an industrialized economy. That is what pulled people out of poverty.

        If you want to see what unfettered technological advancement does, you can read stories from the Gilded Age.

        The cotton gin dramatically increased human enslavement.

        The sewing machine decreased quality of life for seamstresses.

        > During the shirtmakers' strike, one of the shirtmakers testified that she worked eleven hours in the shop and four at home, and had never in the best of times made over six dollars a week. Another stated that she worked from 4 o’clock in the morning to 11 at night. These girls had to find their own thread and pay for their own machines out of their wages.

        These were children, by the way. Living perpetually at the brink of starvation from the day they were born until the day they died, but working like dogs all the while.

      • johnrob 9 hours ago

        Invest in making food/shelter cheaper?

        • dotancohen 9 hours ago

          Food and shelter are cheaper than at almost any time in human history. Additionally, people have more variety of healthy foods all year long.

          No matter how cheap food and shelter are, there will always be people who can not acquire them. Halting all human progress until the last human is fed and sheltered is a recipe for stagnation. Other cultures handle this with strong family bonds - those few who can not acquire food or shelter for whatever reason are generally provided for by their families.

          • estearum 7 hours ago

            The US has built its physical infrastructure to make familial interdependence extremely difficult and often impossible.

            Too monotonous housing mixes over too large of areas.

            • dotancohen 7 hours ago

              Large houses make familial interdependence extremely difficult? That doesn't make sense. Or did I misunderstand? I don't live in the US.

              • estearum 7 hours ago

                Most people don't have houses large enough to house multiple generations inside the house. Houses are sized for parents + kids. And those are the only dwelling units available or legally allowed for vast distances in any direction.

          • johnrob 8 hours ago

            Cheap depends on how we define the cost. In relative terms, food is more expensive than ever:

            https://en.wikipedia.org/wiki/Baumol_effect

            • bryanlarsen 7 hours ago

              Food is not Baumol; productivity increases are how we went from 80% of the population working in primary food production to 1%. These increases have not stopped.

      • LightBug1 8 hours ago

        It's not unthinkable that one of those "layabouts" could have been the next Steve Jobs under different circumstances ...

        People are our first, best resource. Closely followed by technology. You've lost sight of that.

      • sfink 7 hours ago

        > Technological advancement is what has pulled billions of people out of poverty.

        I agree with this. Perhaps that's what is driving the current billionaire class to say "never again!" and making sure that they capture all the value instead of letting any of it slip away and make it into the unwashed undeserving hands of lesser beings.

        Chatbots actually can bring a lot of benefit to society at large. As in, they have the raw capability to. (I can't speak to whether it's worth the cost.) But that's not going to improve poverty this time around, because it's magnifying the disparities in wealth distribution and the haves aren't showing any brand new willingness to give anything up in order to even things out.

        > Giving handouts to layabouts isn't an ideal allocation of resources if we want to progress as a civilization.

        I agree with this too. Neither is giving handouts to billionaires (or the not quite as eye-wateringly wealthy class). However, giving handouts to struggling people who will improve their circumstances is a very good allocation of resources if we want to progress as a civilization. We haven't figured out any foolproof way of ensuring such money doesn't fall into the hands of layabouts or billionaires, but that's not an adequate reason to not do it at all. Perfect is the enemy of the good.

        Some of those "layabouts" physically cannot do anything with it other than spending it on drugs, and that's an example of a set of people who we should endeavor to not give handouts to. (At least, not ones that can be easily exchanged for drugs.) Some of those billionaires similarly have no mental ability of ever using that money in a way that benefits anyone. (Including themselves; they're past the point that the numbers in their bank accounts have any effect on their lives.) That hasn't seemed to stop us from allowing things to continue in a way that funnels massive quantities of money to them.

        It is a choice. If people en masse were really and truly bothered by this, we have more than enough mechanisms to change things. Those mechanisms are being rapidly dismantled, but we are nowhere near the point where figurative pitchforks and torches are ineffective.

      • droopyEyelids 9 hours ago

        What if some of the homeless people are children or people who could lead normal lives but found themselves in dire circumstances?

        Some of us believe that keeping children out of poverty may be an investment in the human capital of a country.

        • dkural 9 hours ago

          Anthropologists measure how civilized a tribe or society was by looking at whether they took care of the elderly, and at what the child survival rates were. The USA leads the developed world in child poverty and child homelessness, and has the highest rate of child death due to violence. Conservatives often bring up the statistic by race. It turns out that bringing people over as slaves and then, after freedom, refusing to provide land, education, fair access to voting rights, or housing (by redlining etc.) - all policies advocated by conservatives of times past - was not the smartest thing to do. Our failure as a civilized society began with, and is in large part a consequence of, the original sin of the USA.

          • estearum 7 hours ago

            Yep

            > purposely create underclass

            > wait

            > act surprised that underclass exists

        • newfriend 9 hours ago

          The US already provides significant aid to those in poverty, especially children. We don't need to stifle innovation to reach some level of aid that bleeding hearts deem sufficient.

          • QuercusMax 8 hours ago

            Do you really think that building giant datacenters full of accelerators that will never be used is "innovation"?

            • _DeadFred_ 8 hours ago

              We need excess capacity for when the next 'rip off anime artist XYZ' fad hits. If we didn't do that, we would be failing capitalism and all the people of history who contributed to our technological progress.

      • _DeadFred_ 8 hours ago

        In the USA, cowboys were homeless guys. You know that, right? Like, they had no home and slept outside. Many were pretty big layabouts. Yet they are a pretty big part of our foundation myth, and we don't say 'man, they just should have died'.

        Can I go be a cowboy? Can I just go sleep outside, maybe work a few minimally paying cattle-run jobs a year? No? If society won't allow me to just exist outside, then society has an obligation to make sure I have a place to lay my head.

    • UtopiaPunk 8 hours ago

      I don't think it is a coincidence that the areas with the wealthiest people/corporations are the same areas with the most extreme poverty. The details are, of course, complicated, but zooming way, way out, the rich literally drain wealth from those around them.

  • reactordev 9 hours ago

    I threw in the towel in April.

    It's clear we are Wile E. Coyote running in the air already past the cliff and we haven't fallen yet.

    • saulpw 8 hours ago

      What does it mean to throw in the towel, in your case? Divesting from the stock market? Moving to a hobby farm? Giving up on humanity?

      • reactordev 6 hours ago

        Any dream of owning a home, having retirement, even a career after a couple years when it’s clear I’m over the hump. I’m trying to squeeze as much as I can before that happens and squirrel it away so at least I can have a van down by a river.

        • ohhnoodont an hour ago

          What does squirreling it away mean though? A pile of cash instead of investments? The reality is that you don’t get to throw in the towel.

  • skippyboxedhero 2 hours ago

    Can you imagine if the US wasn't so unbelievably far ahead of everyone else?

    I am sure the goat herders in rural regions of Pakistan will think themselves lucky when they see the terrible sight of shareholder value being wantonly destroyed by speculative investments that enhance the long-term capital base of the US economy. What an uncivilized society.

  • slashdave 4 hours ago

    Well, at least this doesn't involve death and suffering, like the old-fashioned way to jump-start an economy by starting a global war.

  • PrairieFire 9 hours ago

    Agree the capital could be put to better use. However, I believe the alternative is that this capital wouldn't otherwise have been put to work in ways that allow it to leak out to the populace at large. For some of the big investors in AI infrastructure, this is cash that was previously, and likely would otherwise have been, put toward stock buybacks. For many of the big investors pumping cash in, these are funds deploying the wealth of the mega rich that, again, would otherwise have been deployed in ways that don't leak down to the many who are now receiving it via this AI infrastructure boom (datacenter materials, land acquisition, energy infrastructure, building trades, etc.).

    • amanaplanacanal 9 hours ago

      It could have, though. Higher taxes on the rich, spend it on social programs.

      • ayaros 9 hours ago

        Why is this so horrible? Put more resources in the hands of the average person. They will get pumped right back into the economy. If people have money to spend, they can buy more things, including goods and services from gigantic tax-dodging mega-corporations.

        Gigantic mega-corporations do enjoy increased growth and higher sales, don't they? Or am I mistaken?

        • 542354234235 8 hours ago

          The shift in the US to the idea of “job creators” being business owners is part of it. It was just a way to direct money to the already rich, as if they would hire more people with that money, when it is plainly obvious that consumers are the real job creators: if they buy more goods and services, businesses will hire more people to make or provide those things.

          Or maybe it was trickle down economics. Trickle up economics still end up with the rich getting the money since we all buy things from companies they own, it just goes through everyone else first. Trickle down cuts out the middleman, which unfortunately is all of us.

          • panick21_ 7 hours ago

            The framing of X or Y as 'job creators' is idiotic. It's literally the most basic fact of economics that you need both producers and consumers; otherwise you don't have an economy.

            The more economically correct way to express this would be that entrepreneurs and companies who innovate increase productivity, which makes the overall economy more efficient and allows your country to grow.

            > Or maybe it was trickle down economics. Trickle up economics still end up with the rich getting the money since we all buy things from companies they own, it just goes through everyone else first. Trickle down cuts out the middleman, which unfortunately is all of us.

            This just sounds like quarter-baked economic ideas you have made up yourself. Neither 'trickle down' nor 'trickle up' are concepts economists use. And that you confidently assert anything about the social outcomes of these 'concepts' is ridiculous.

        • thmsths 9 hours ago

          Because the entire Western culture has shifted to instant gratification. Yes, what you suggest would most likely lead to increased business eventually. But they want better numbers this quarter, so they resort to cheap tricks like financial engineering and layoffs to get an immediate boost.

        • coliveira 8 hours ago

          Because government is always a fight about resources. More resources in the hands of common people to make their lives better means less money in the hands of powerful corporations and individuals, be it in the form of higher taxes on the rich or less money flowing directly to their enterprises.

          • QuercusMax 8 hours ago

            One of the big issues is money in politics. Our congresspeople make a killing off of legal insider trading, they take huge donations from companies, and the supreme court has even said that it's cool to give "gratuities" in exchange for legislation or court rulings you like.

            Corruption is killing this country.

        • panick21_ 9 hours ago

          I'm not saying you are wrong that some redistribution can be good, but your analysis is simplistic and ignores many factors. You can't just redistribute and then say 'well, people will spend the money'; that's literally the 'Broken Window' fallacy from economics. You are ignoring that if you don't redistribute it, the money also gets spent, just differently. Also, the central bank is targeting AD (aggregate demand), so you're not actually increasing nominal income by redistributing.

          • coliveira 8 hours ago

            There are many ways of spending money on the population that don't involve just "distribution of money", as it's portrayed nowadays. Child care, free and high-quality schools, free transportation, free or subsidized healthcare, investment in labor-intensive industries - these are all examples of expenditures that translate into better quality of life and also improve the country's competitiveness.

            • panick21_ 7 hours ago

              That has literally nothing to do with the point I have argued.

          • 542354234235 8 hours ago

            Take a million dollars, give 1,000 poor people $1,000 and every dollar will be spent on goods and services. The companies running those services and making those goods will need to have their employees work more hours, putting more money back in poor people’s pockets in addition to the money the companies make. Those employees have a few extra dollars to spend on goods and services, etc.

            Give a rich person a million dollars, and they will put it in an offshore tax shelter. That’s not exactly driving economic activity.

            • panick21_ 7 hours ago

              You are simply disagreeing with 99% of economists.

              Money in a tax shelter doesn't go through a portal into another universe. It's either invested or saved as some kind of asset, and in that form it is still in circulation. And again, even if you assume it increases money demand (decreases velocity), the central bank targets AD and balances that out.

              Based on your logic, a country that taxes 100% of all income and redistributes it would become infinitely rich. Your logic is basically 'if nobody saves and everybody spends all their income, everybody will be better off'.

              This is not how the economy works, even if it feels good to think that. It's a fallacy.

              Where you could have a point is that the tax impact is potentially slightly different, but that's hard to prove.
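
              For concreteness, the identity behind the "central bank targets AD" point is the textbook equation of exchange - just a sketch of the standard accounting, not anything specific to the numbers in this thread:

                  % Equation of exchange:
                  %   M = money supply, V = velocity of money,
                  %   P = price level, Y = real output
                  M \cdot V = P \cdot Y
                  % Nominal demand (AD) is the right-hand side, P \cdot Y.
                  % If more income is saved, V falls; a central bank targeting AD
                  % adjusts M to offset the change in V, leaving P \cdot Y roughly
                  % where it was. By the same token, redistribution that merely
                  % raises V does not, on its own, raise nominal income.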

          • QuercusMax 8 hours ago

            Stock buybacks don't build anything. They're just a way to take money from inside a company and give it to the shareholders.

            • panick21_ 7 hours ago

              I don't know what that has to do with the point discussed.

              Do you think shareholders don't spend money, but employees do, or something?

      • phkahler 8 hours ago

        Let's pay down the debt before increasing social programs. You know, save the country first. If a penny saved is a penny earned, then everyone, rich or poor, is looking for a handout.

        • amanaplanacanal 5 hours ago

          The only person who has come close to balancing the federal budget was Clinton. But Republicans still try to position themselves as the party of fiscal responsibility.

          If the voters can't even figure out why the debt keeps going up, I think you are fighting a losing battle.

    • Atheros 7 hours ago

      > likely would have otherwise been put toward stock buybacks

      Stock buybacks from whom? When stock gets bought, the money doesn't disappear into thin air; the same cash is now in someone else's hands. Those people would then want to invest it in something, and then we're back to square one.

      You assert that if not for AI, wealth wouldn't have been spent on materials, land, trades, etc. But I don't think you have any reason to think this. Money is just an abstraction. People would necessarily have done something with their land, labor, and skills. It isn't like there isn't unmet demand for things like houses or train tunnels or new-fangled types of aircraft or countless other things. Instead it's being spent on GPUs.

      • PrairieFire 7 hours ago

        Totally agree that the money doesn’t vanish. My point isn’t “buybacks literally destroy capital,” it’s about how that capital tends to get redeployed and by whom.

        Buybacks concentrate cash in the hands of existing shareholders, which are already disproportionately wealthy and already heavily allocated to financial assets. A big chunk of that cash just gets recycled into more financial claims (index funds, private equity, secondary shares, etc), not into large, lumpy, real world capex that employs a bunch of electricians, heavy equipment operators, lineworkers, land surveyors, etc. AI infra does that. Even if the ultimate economic owner is the same class of people, the path the money takes is different: it has to go through chip fabs, power projects, network buildouts, construction crews, land acquisition, permitting, and so on. That’s the “leakage” I was pointing at.

        To be more precise: I’m not claiming “no one would ever build anything else”, I’m saying given the current incentive structure, the realistic counterfactual for a lot of this megacap tech cash is more financialization (buybacks, M&A, sitting on balance sheets) rather than “let’s go fund housing, transit tunnels, or new aircraft.”

  • jstummbillig 8 hours ago

    I don't know what to do with this take.

    We need an order of magnitude more clean productivity in the world so that everyone can live a life that is at least as good as what fairly normal people in the west currently enjoy.

    Anyone who thinks this can be fixed with current Musk money is simply not getting it: if we liquidated all of it, that would buy a dinner for everyone in the world (and then, of course, that would be it, because the companies he owns would stop functioning).

    We are simply, obviously, not good enough at producing stuff in a sustainable way (or: at all), and we owe it to every human being alive to take every chance to make this happen QUICKLY, because we are paying with extremely shitty human years, and they are not ours.

    Bring on the AI, and let's make it work for everyone – and, believe me, if this is not to be to the benefit of roughly everyone, I am ready to fuck shit up. But if the past is any indication, we are okay at improving the lives of everyone when productivity increases. I don't know why this time would be any different.

    If the way to make good lives for all 8 billion of us must lead to more Musks because, apparently, we are too dumb to do collectivization in any sensible way, I really don't care.

  • anthomtb 8 hours ago

    As a fellow elder millennial I agree with your sentiment.

    But I don't see the mechanics of how it would work. Rewind to October 2022. How, exactly, does the money* invested in AI since that time get redirected towards whatever issues you find more pressing?

    *I have some doubts about the headline numbers

  • arisAlexis 9 hours ago

    Yes, this capital allocation is a once-in-a-lifetime opportunity to create AGI that will solve diseases and poverty.

    • edhelas 9 hours ago

      </sarcasm>

      • arisAlexis 6 hours ago

        This is literally the view of Demis Hassabis, Sergey Brin, Dario Amodei and others. Are you seriously implying they are trolling us?

        • pezezin an hour ago

          Poverty is a social and political issue, not technological. We have more than enough resources on this planet to fix it, but we don't.

  • NedF 7 hours ago

    [dead]

Animats 9 hours ago

How much has actually been spent on AI data centers vs. amounts committed or talked about? That is, if construction slows down sharply, what's total spend?

pharos92 3 hours ago

I find it disturbing how long people wait to accept basic truths, as if they need permission to think or believe a particular outcome will occur.

It was quite obvious that AI was hype from the get-go. An expensive solution looking for a problem.

The cost of hardware. The impact on hardware and supply chains. The impact to electricity prices and the need to scale up grid and generation capacity. The overall cost to society and impact on the economy. And that's without considering the basic philosophical questions "what is cognition?" and "do we understand the preconditions for it?"

All I know is that the consumer and general voting population lose no matter the outcome. The oligarchs, banking, government and tech-lords will be protected. We will pay the price whether it succeeds or fails.

My personal experience of AI has been poor. Hallucinations, huge inconsistencies in results.

If your day job exists within an arbitrary, non-productive linguistic domain, it's a great tool. Image and video generation? Meh. Statistical and data-set analysis? Average.

  • wordpad 3 hours ago

    Just like the .com bust that followed companies going online, there is hype, but there is also real value.

    Even slow, non-tech legacy industry companies are deploying chatbots across every department - HR, operations, IT, customer support. Leadership everywhere is already planning to cut 50-90% of staff from most departments over the next decade. It matters, because these initiatives are receiving internal funding, which will flow out to AI companies that deploy this tech and scale it.

    • SchemaLoad 3 hours ago

      The "legacy" industry companies are not immune from hype. Some of those AI initiatives will provide some value, but most of them seem like complete flops. Trying to deploy a solution without an idea of what the problem or product is yet.

cmrdporcupine 29 minutes ago

The investors in these companies and all this infrastructure are not so much concerned with whether any specific company pays off with profits, necessarily.

They are gambling instead that these investments pay off in a different way: by shattering high labour costs for intellectual labour and de-skilling our profession (and others like it) -- "proletarianising" it in the 19th century sense.

Thereby increasing profits across the board and breaking the bargaining power (and outsized political power, as well) of upper middle class technology workers.

Put another way, this is an economy-wide investment in a manner similar to early 20th century mass factory industrialization. It's not expected that today's big investments are tomorrow's winners, but nobody wants to be left behind in the transformation, and a lot of political and economic power is highly interested in the idea of automating away the remnants of the Alvin Toffler "Information Economy" fantasy.

bytesandbits 2 hours ago

Mind you, IBM makes $7B+ from keeping old-school enterprises hooked on 30-plus-year-old tech like z/OS and COBOL and their own super-outdated stack. Their AI division is frankly embarrassing. Of course they would say that. IBM is one of the most conservative, anti-progress leeches in the entire tech industry. I am glad they are missing out big time on the AI gold rush. To me, if anything, this is a green signal.

nashashmi 8 hours ago

Don't worry. The same servers will be used for other computing purposes. And maybe that will be profitable. Maybe it will be beneficial to others. But this cycle of investment and loss is a version of wealth distribution. Some benefit.

The banks and lenders always benefit.

  • coliveira 8 hours ago

    That would be true for general-purpose servers. But what they want is lots of special-purpose AI chips. While it is still possible to use those for something else, it's very different from having a generic server farm.

    • danans 3 hours ago

      LAN party at the end of the universe?

    • scotty79 2 hours ago

      I can't imagine everybody suddenly abandoning AI like a broken toy and taking all the special-purpose AI chips offline. AI serves millions of people every day. It's here to stay; even if it doesn't get any better than it is, it already brings immense value to its users. It will keep being worth something.

eitally 9 hours ago

At some point, I wonder if any of the big guys have considered becoming grid operators. The vision Google had for community fiber (Google Fiber, which mostly fizzled out due to regulatory hurdles) could be somewhat paralleled with the idea of operating a regional electrical grid.

maxglute 9 hours ago

How long can AI GPUs stretch? An optimistic 10 years, and we're still looking at $400B+ in profit to cover interest. In terms of depreciating assets, silicon is closer to tulips than to rail or fiber.
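
Rough napkin math behind that kind of number, as a sketch only: take the ~$8T figure floated elsewhere in this thread, and assume a ~5% cost of capital and an optimistic 10-year useful life (all three are assumptions for illustration, not reported figures):

    # Back-of-envelope: what the AI buildout has to earn per year just to
    # stand still. Every input here is an assumption for illustration.
    total_spend = 8e12           # assumed cumulative AI infra spend, USD
    cost_of_capital = 0.05       # assumed blended interest / required return
    gpu_lifetime_years = 10      # optimistic useful life of the accelerators

    annual_interest = total_spend * cost_of_capital         # ~$400B/year
    annual_depreciation = total_spend / gpu_lifetime_years  # ~$800B/year

    print(f"interest alone:             ${annual_interest / 1e9:,.0f}B per year")
    print(f"straight-line depreciation: ${annual_depreciation / 1e9:,.0f}B per year")
    print(f"break-even before opex:     ${(annual_interest + annual_depreciation) / 1e9:,.0f}B per year")

Even before electricity and staffing, that is on the order of a trillion dollars a year the hardware has to earn back, which is where the "tulips rather than rail or fiber" depreciation worry comes from.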

rmoriz 2 hours ago

The second buyer will make truckloads of money. Remember the data center and fiber network liquidations of 2001+: smart investors picked up the overcapacity, and after a couple of years the money printer worked. This time it will be the same, only the single-purpose hardware (LLM-specific GPUs) will probably end up in a landfill.

  • nialse 2 hours ago

    The game is getting OpenAI to owe you as much money as you can. When they fail to pay back, you own OpenAI.

m3kw9 an hour ago

Says the guy missing out on it

bluGill 9 hours ago

This is likely correct overall, but it can still pay off in specific cases. However, those are not blind investments; they are targeted, with a planned business model.

matt-p 2 hours ago

Unless we get AGI.

  • BoorishBears 2 hours ago

    Consumers will eat it all. AI is very good at engaging content, and it's getting better by the day: it won't be the AGI we wanted, but maybe the AGI we've earned.

wmf 9 hours ago

$8T may be too big of an estimate. Sure you can take OpenAI's $1.4T and multiply it by N but the other labs do not spend as much as OpenAI.

Ekaros 8 hours ago

How much of Nvidia's price is based on a 5-year replacement cycle? If that stops or slows along with new demand, could it also affect things? Not that 5 years doesn't seem like a very long horizon now.

matt_s 6 hours ago

There is something to be said about what the ROI is for normal (i.e. non-AI/tech) companies using AI. AI can help automate things; robots have been replacing manufacturing jobs for decades, but there is an ROI on that which I think is easier to see and count: fewer humans in the factory, etc. There seems to be a lot of exaggeration around AI these days, and the AI companies have only begun to raise rates - they won't go down.

The AI bubble will burst when normal companies start failing to hit their revenue/profit goals and have to answer investor relations calls about it.

parapatelsukh 9 hours ago

The spending will be more than paid off, since the taxpayer is the lender of last resort. There are too many funny names among the investors/creditors, a lot of mountains in Germany and similar, ya know.

ninjaa 4 hours ago

What does Jim Cramer have to say?

jmclnx 9 hours ago

I guess he is looking directly at IBM's cash cow, the mainframe business.

But I think he is correct; we will see. I still believe AI will not give the CEOs what they really want: no or very cheap labor.

qwertyuiop_ 9 hours ago

The question no one seems to be answering is: what will the EOL be for these newer GPUs being churned out by NVIDIA? What percentage of annual capital expenditures will GPU refresh be? Will they be perpetually replaced as NVIDIA comes up with newer architectures and the AI companies chase the proverbial lure?

  • scotty79 2 hours ago

    I think the key to replacement is power efficiency. If Nvidia is not able to make GPUs that are cheaper to run than the previous generation, there's no point in replacing the previous generation. Time doesn't matter.

devmor 9 hours ago

I suppose it depends on your definition of "pay off".

It will pay off for the people investing in it, when the US government inevitably bails them out. There is a reason Zuckerberg, Huang, etc are so keen on attending White House dinners.

It certainly won't pay off for the American public.

oxqbldpxo 9 hours ago

FB playbook. Act (spend) then say sorry.

sombragris 4 hours ago

"yeah, there's no way spending in those data centers will pay off. However, let me show you this little trinket which runs z/OS and which is exactly what you need for these kinds of workloads. You can subscribe to it for the low introductory price of..."

verdverm 10 hours ago

The IBM CEO is steering a broken ship and hasn't improved its course; not someone whose words you should take seriously.

1. They missed the AI wave (they hired me to teach Watson law only to lay me off 5 weeks later, one cause of the serious talent issues over there)

2. They bought most of their data centers (companies); they have no idea how to build and operate one, not at the scale the "competitors" are operating at

  • nabla9 10 hours ago

    Everyone should read his arguments carefully. Ponder them in silence, and accept or reject them based on the strength of the arguments.

    • scarmig 9 hours ago

      His argument follows almost directly, and trivially, from his central premise: a 0% or 1% chance of reaching AGI.

      Yeah, if you assume technology will stagnate over the next decade and AGI is essentially impossible, these investments will not be profitable. Sam Altman himself wouldn't dispute that. But it's a controversial premise, and one that there's no particular reason to think that the... CEO of IBM would have any insight into.

      • skeeter2020 9 hours ago

        then it seems like neither Sam Altman (pro) nor IBM (proxy con) has credible or even really interesting or insightful evidence, theories ... even suggestions for what's likely to happen? i.e. We should stop listening to all of them?

        • scarmig 9 hours ago

          Agreed. It's essentially a giant gamble with a big payoff, and they're both talking their books.

      • PunchyHamster 3 hours ago

        It's a very reasonable claim to make, but yes, the average denizen of the peanut gallery can spot that this is a bubble from a mile away; they don't need the "insight" of napkin math done by some CEO who isn't even in the industry.

        Though he's probably not too happy that they sold the server business to Lenovo; they could at least have earned something selling shovels.

      • verdverm 8 hours ago

        we don't need AGI to use all that compute

        we need businesses who are willing to pay for ai / compute at prices where both sides are making money

        I for one could 10x my AI usage if the results on my side pan out. Spending $100 on AI today has ROI; will 10x that still have ROI for me in a couple of years? Probably. I expect agentic teams to increase in capability and take on more of my work. Then the question is whether I can turn that increased productivity into more revenue (>$1000/month; one more client would cover this and then some).

    • nyc_data_geek1 9 hours ago

      IBM can be a hot mess, and the CEO may not be wrong about this. These things are not mutually exclusive.

  • duxup 9 hours ago

    Is his math wrong?

    • verdverm 8 hours ago

      Are the numbers he's claiming accurate? They seem like big numbers pulled out of the air, certainly much larger than the numbers we've actually seen committed to (not even deployed yet).

  • malux85 10 hours ago

    Sorry that happened to you; I have been there too.

    When a company is hiring and laying off like that, it's a serious red flag. The one that did that to me is dead now.

    • verdverm 8 hours ago

      It was nearly 10 years ago and changed the course of my career for the better

      make lemonade as they say!

  • observationist 9 hours ago

    IBM CEO has sour grapes.

    IBM's HPC products were enterprise-oriented slop that banked on their reputation, and the ROI torched their credibility once compute costs started getting taken seriously. Watson and other products got smeared into Kafkaesque, arbitrary branding for other product suites, and they were nearly all painful garbage - mobile device management standing out as a particularly grotesque system to use. Now IBM lacks any legitimate competitive edge in any of the bajillion markets they tried to target, has no credibility in any of their former flagship domains, and nearly every one of their products is hot garbage that costs too much, often by orders of magnitude, compared to similar functionality you can get from open source or even free software offered and serviced by other companies. They blew a ton of money on HPC before there was any legitimate reason to do so. Watson on Jeopardy was probably the last legitimately impressive thing they did, and all of their tech and expertise has been outclassed since.

bmadduma 3 hours ago

No wonder he is saying that: they lost the AI game, and no top researcher wants to work for IBM. They spent years developing Watson, and it is dead. I believe this is a company that should not exist.

  • BearOso 3 hours ago

    Maybe it's the opposite. IBM spent years on the technology. Watson used neural networks, just not nearly as large. Perhaps they foresaw that it wouldn't scale or that it would plateau.