Why AI is a house of cards: 1. You pay $200 a year for an AI app (like Cursor). 2. Cursor pays OpenAI $500 for API tokens ($300 of which is VC funding). 3. OpenAI pays AWS $1,000 for compute ($500 of which is VC funding). 4. AWS pays $10k for $NVDA GPUs. See the problem?
Unless you as a user are miraculously comfortable paying $1k for the AI app, the only thing propping up AI is VC funding. No VC funding means:
- The AI application layer is unprofitable
- The LLM layer is unprofitable
- The compute layer is unprofitable
- The GPU layer is unsustainable
The bulls say that inference cost is dropping exponentially (it is), which would mean the cost of compute to LLMs like OpenAI (and by extension to AI apps like Cursor) drops too. But this must happen BEFORE they run out of VC funding, and it hinges on how much users will pay.
If the end users are not able to pay the breakeven price for this entire stack, e.g.:
- Cost of $NVDA GPUs depreciated over their lifecycle
- Hyperscaler cloud margins
- LLM inference margins
- AI wrapper margins
Which could literally total $1k per user, then this whole game implodes.
Right now, people are paying $20 a month for mid-tier and $200 a month for pro-tier plans (e.g. Cursor). That's $240 a year and $2,400 a year respectively. I'd guess that a high-usage user probably burns $10k+ worth of compute. So unless you can 10x what these people pay, this isn't sustainable.
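The arithmetic above can be sketched in a few lines. All figures here are the thread's own rough estimates (pro tier at $200/month, ~$10k of compute burned by a heavy user), not real company data, and `subsidy_per_user` is a hypothetical helper name:

```python
def subsidy_per_user(user_pays, compute_cost):
    """VC subsidy needed per user per year: the compute cost not covered by revenue."""
    return max(compute_cost - user_pays, 0)

pro_revenue = 200 * 12   # pro-tier plan: $200/month -> $2,400/year
est_compute = 10_000     # thread's guess for a heavy user's annual compute cost

gap = subsidy_per_user(pro_revenue, est_compute)
print(f"Annual gap per heavy user: ${gap:,}")                              # -> $7,600
print(f"Implied breakeven price multiple: {est_compute / pro_revenue:.1f}x")  # -> 4.2x
```

On these particular numbers the implied price increase to break even is closer to ~4x than 10x; the larger multiple would apply if the compute estimate is low or if lighter-usage tiers are also deeply underwater.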
I see only 3 viable pathways for these companies: 1. LLMs need to cut model/inference costs drastically (by gating usage to AI wrappers) 2. AI application companies need to start charging WAY more across the board (hard to do right now because a land grab is in play) https://x.com/zoomyzoomm/statu...
3. GPU prices come down substantially (e.g. a $NVDA competitor emerges), which would alleviate a LOT of the cost downstream, e.g.: 1. The hyperscalers can cut prices 2. LLMs can cut API prices 3. AI application companies can run breakeven on current pricing
Ultimately, the "black swan" I see is that deflation is coming for everything in the technology stack: 1. GPU costs falling (due to competitors emerging) 2. Cloud costs falling (due to neoclouds and hyperscaler competition) 3. Inference costs falling (due to better hardware)
All this means is that the unit economics will EVENTUALLY improve. But WHEN this happens, and WHAT margin profile each of these players has when all is washed out, is yet to be determined. If I had to guess: Winners: GPU providers and hyperscalers. Losers: LLMs and AI app companies.
This is why you're seeing the entire hedge fund community go long: $NVDA $MSFT $GOOG $AMZN. Because these 4 companies are guaranteed to make a killing for the next decade, regardless of what happens in AI.
I hate consensus, so I look for ways to play the same trade at 50% of the valuation. To me, this is China tech:
$NVDA = $SMICY
$MSFT = $TCEHY
$GOOG = $BIDU
$AMZN = $BABA
ALSO, I haven't touched on other huge bottlenecks like: - Rare earths - Energy production - Embodied AI scalability. China crushes in all 3 and has: - All rare earth minerals - All rare earth processing - Unlimited clean energy (nuclear) - 99% of robotics production capacity
My point is, it's inevitable that China will catch up on the LLM front (whether through closed models like ByteDance's or open source like Qwen, DeepSeek, etc.). So the next question becomes: what happens AFTER the models plateau? It comes down to atoms. It comes down to scale.
To round out this thread, this is a great summary of the problem with AI wrappers
@zoomyzoomm The 2009 Tesla Roadster had a base price of $109,000 and they lost money on every car. Yet, today, we have electric cars everywhere. Tech gets cheaper as it scales. So while AI asset prices are deranged and current costs are subsidized, it's clearly a real technology that will
@Molson_Hart Agree
@zoomyzoomm I think I'm going to need to be posting this a lot in the coming months. Any perceived efficiency gains come from giving away compute. Once the losses pile up and costs are finally passed through to the end user, it will no longer be an economical choice for any task.
@zoomyzoomm You extrapolate that it will always be this costly and they wouldn’t optimize it.
@bizlet7 I'm expecting a 10x decrease in the cost of inference, and the unit economics are still dicey. The problem is that the cost to train models will 10x to get to GPT-6+, which implies the LLMs will need to raise margins significantly to reach breakeven. https://x.com/zoomyzoomm/statu...
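The squeeze described in this reply can be modeled as a toy recurrence: per model generation, inference cost falls 10x while training cost rises 10x. Both rates are the reply's assumptions, and `next_generation` is a hypothetical helper, not anyone's real cost model:

```python
def next_generation(inference_cost, training_cost,
                    inference_decline=10.0, training_growth=10.0):
    """One model generation: inference gets cheaper, training gets pricier."""
    return inference_cost / inference_decline, training_cost * training_growth

inf, train = 1.0, 1.0  # costs normalized to the current generation
for gen in range(1, 4):
    inf, train = next_generation(inf, train)
    print(f"Gen +{gen}: inference x{inf:g}, training x{train:g}, "
          f"training/inference ratio {train / inf:,.0f}")
```

Under these assumptions the training-to-inference cost ratio grows 100x per generation, i.e. each new frontier model must be amortized over vastly more (cheap) inference, which is the margin pressure the reply points at.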
@zoomyzoomm you’re the first person to figure this out
@rvc330 Thanks for the support fam
@zoomyzoomm Pfff you arent counting all the businesses people create and profit from…
@zoomyzoomm It all gets fixed by Moore’s Law. But yeah, a retarded investment for retail.
@zoomyzoomm Hundreds of thousands pay $100/month to use AI for driving and that is definitely profitable. Millions more pay $1+/minute to use AI to be driven and that is likely gross margin positive by now. No VC funding in either (anymore).
@zoomyzoomm you must not know how innovation has happened the last 50 years... or really for past 300
@zoomyzoomm Let me run it on device! Everyone selling you something has a flashy UI but any on device solutions are rusty to say the least. Paging @Meta !
@zoomyzoomm There's no problem. This rings Facebook/Amazon naysayers complaining about lack of profitability for years. Taking early losses makes it much easier to get early adoption and change behavior Compute and energy costs will crater to accommodate, meanwhile AI companies will
@zoomyzoomm Why did you make this retarded thread lol. You’re not even trying to monetize. “Costs must come in line with mass affordability before adoption can take place. And they must do it before funding ends…” Yes this is true for every single revolutionary invention. Good news is
@zoomyzoomm I see the problem. You used fabricated numbers.
@zoomyzoomm Can ads save the day? Like they have done for Google search, which was losing money at the beginning?
@zoomyzoomm The cost of tokens is going to be a big wall in the future. Learning how to fine tune your AI with prompt engineering and similar to minimize the token cost is gonna be important.
@zoomyzoomm I’d pay at least 10X more for the AI I use. I already pay $500 per month.
@zoomyzoomm Cursor is doomed because it ultimately can't compete against its providers on cost and has been shoveling VC money to them in exactly the way you describe. But that doesn't stop providers rapidly becoming sustainable. OAI just brought the cost of an Opus-level coding model down
@zoomyzoomm Despite this practice being illegal, it is used by all public companies to pump up revenue and scam stockholders
@zoomyzoomm People all said that about Uber 12 years ago - we are in the hyper growth phase where market share is king. Costs will come down the experience curve and pricing will rationalize over the next few years.
@zoomyzoomm Eventually, like all Technology, we will find ways to make it cheaper. But you gotta start to find the Business Model early so you can start refining it.