Dear reader friends:
I came across @Dr_Gingerballs's analysis of the tech-stock bubble on Twitter and recommend it to everyone.
Midjourney illustration: Can AI recognize the AI bubble?
Improve financial literacy | Watch the future | Manage risk
The semiconductor industry is screaming at us that AI is all hype. Yes, lots of orders for H100 GPUs, but even an AI datacenter still needs CPUs, memory, and hard drives. Then there are AI applications such as computer vision, controls, and robotics, which require specialized chips for efficient compute.
The companies that provide these things are all telling us that AI is not going to deliver the types of growth people are foaming at the mouth for.
$INTC revenues down 14% YoY. Net earnings down 78% YoY. They make the CPUs that go in the servers.
$AMD reports earnings next week, which will be pivotal, but through Q3 revenues were down 8% YTD and earnings down 85% YTD. They also make CPUs that go in the servers.
$MU makes memory. Revenues up 15% YoY but they have had negative earnings for over a year.
Samsung memory chip earnings have dropped 78% YoY.
$WDC also makes hard drives and memory. They just posted revenues down 2% YoY and more than a year of net losses.
$TXN makes all sorts of chips necessary for AI, such as computer vision processors and robot automation chips. Their revenues are down 13% YoY and their net income is down 30% YoY.
In short, hardware sucks for AI. Except for one company: $NVDA. Revenue up 206% YoY, net income up 1200% YoY. Even $TSMC, the company that makes the chips for $NVDA, has had a 1.5% YoY revenue decline and an earnings decline of 19% YoY.
$NVDA has so far this year sold about $30B of datacenter equipment. These are primarily A100 and H100 chips, which are built on a 5nm process. $TSMC reports that about 35% of the chips they shipped in the second half of 2023 were 5nm. Before that the share was variable (as Apple moves down to 3nm and others move from 7nm to 5nm), but the lowest figure was around 20%. So let's give $NVDA the benefit of the doubt and say 15% of the chips $TSMC makes are for $NVDA, and that half of those are for data centers. Against $TSMC's roughly $70B in annual revenue, that works out to about $5B of revenue for $TSMC to make A100 and H100 chips this year.
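A quick back-of-envelope check of that $5B figure. The ~$70B annual $TSMC revenue is an approximation, and the 15% and 50% splits are the thread's own guesses, not reported numbers:

```python
# Rough estimate of TSMC revenue attributable to NVDA's A100/H100 orders.
# All inputs are approximations, not reported figures.
tsmc_annual_revenue = 70e9   # ~$70B, approximate 2023 TSMC revenue
share_for_nvda = 0.15        # generous guess: 15% of TSMC output goes to NVDA
share_for_datacenter = 0.5   # guess: half of NVDA's chips are datacenter parts

tsmc_ai_revenue = tsmc_annual_revenue * share_for_nvda * share_for_datacenter
print(f"TSMC revenue from A100/H100 production: ~${tsmc_ai_revenue / 1e9:.1f}B")
# -> roughly $5B
```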
Then on the $NVDA side: they have made around $30B on AI chips this year, putting their markup over what they pay $TSMC at roughly 600%. Others have estimated 800%. So nobody, not even $NVDA's own supplier, has ANY pricing power against this demand. Nobody is making money except $NVDA.
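Following the same logic, here is the markup arithmetic the thread is pointing at. This is only a sketch: the payment to $TSMC is not $NVDA's only cost of goods, so the implied margin overstates the real one.

```python
nvda_ai_revenue = 30e9   # ~$30B of AI/datacenter revenue this year
tsmc_cost = 5e9          # estimated payment to TSMC from the step above

markup = nvda_ai_revenue / tsmc_cost              # ~6x, the "600%" figure
implied_margin = 1 - tsmc_cost / nvda_ai_revenue  # ~83% if TSMC were the only cost
print(f"Markup over foundry cost: {markup:.0f}x ({markup:.0%} of cost)")
print(f"Implied margin if TSMC were the only input cost: {implied_margin:.0%}")
```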
Then we move to the cloud side. Pricing is pretty transparent for some specialty AI/LLM cloud providers like CoreWeave and Lambda Labs. They have been selling compute on the H100 for about $100/day. If they are paying $40k a pop, then it will take over a year just to break even on the chip, and most likely 2-3 years. That is longer than the likely useful life of the chip. So the cloud providers also do not have the pricing power to make money off of AI.
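The payback math on a rented-out H100 is easy to sketch under the thread's assumptions (about $40k per card and about $100/day of rental revenue); power, networking, and idle time would all push the real payback further out:

```python
h100_price = 40_000    # approximate purchase price per H100 card, USD
rental_per_day = 100   # approximate rental revenue per card per day

days_to_break_even = h100_price / rental_per_day
print(f"Break-even: {days_to_break_even:.0f} days "
      f"(~{days_to_break_even / 365:.1f} years)")
# ~400 days just to recover the hardware, before any operating costs
```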
At one point I looked through the listed AI partners for $ORCL, which has been given preferential treatment by $NVDA since Oracle isn't trying to compete with chips of its own. The partners of any significant size are all companies that $NVDA has heavily invested in, or Larry Ellison's own bio institute. Hardly a who's who of titans of organic demand.
In fact, everywhere you look, most of the activity is localized to private startups, many of them funded in part by $NVDA. I have even found some startups that boast of holding more in H100 chips than their entire raised capital to date. There is little transparency into their books, and many of them likely are not profitable and may never be.
So to summarize. The people who make hardware for AI compute are all losing money except for $NVDA. The people who provide cloud compute to train LLMs are selling H100 compute time at a loss. The people using the cloud compute are likely running on fumes with venture capital money, some of which came from $NVDA.
So the industry is absolutely running on hype, as people scramble to capture the money flowing out of VC funds. To be fair, there are some cool things coming out of these LLMs. Nearly instant voice translation is pretty cool. The chatbot is sort of neat. I've heard some good things about the code generation, although admittedly reviews are very mixed and polarized. The image generation seems cool, but it looks likely that the companies behind it will be required to pay royalties to artists. None of these things is a $1T-a-year industry.
But that's not the hype people have been sold. They have been sold on the imminent creation of artificial general intelligence: an intelligence so advanced it can outperform a human in cognitive and perhaps physical tasks. The problem? LLMs cannot, by themselves, ever achieve this. They are just extremely large regressions over existing data. They simply collage together an output based on an input. They cannot reason. They cannot think. They cannot truly create something outside of their training data. They are missing fundamental components of intelligence that nobody has solutions for.
So on one side you have a fundamental misunderstanding by investors of the capabilities of the technology, fueled by a technologically ignorant media. On the other side you have mega-cap technology companies willing to spend vast amounts of money to maintain dominance (and to find palatable excuses to raise prices without drawing too much antitrust ire). The result is a fever dream in which only those hyping the technology appear to be making any money, and not the people building it.
Eventually the fever will break and we will be left with a massive misallocation of capital and a lot of broken hearts.
Buyer beware.