Practically everyone building an artificial intelligence (AI) large language model relies on Nvidia (NASDAQ: NVDA) for its high-end graphics processing units (GPUs). Big tech companies such as Alphabet, Microsoft, Amazon, and Meta Platforms have been committing billions of dollars to Nvidia just to get their hands on the next batch of chips.
But Apple (NASDAQ: AAPL), true to its roots, thinks differently.
The company revealed its long-awaited AI features in early June. It’s calling the system Apple Intelligence, which consists of multiple generative AI models fine-tuned for various tasks, such as writing, summarizing, image generation, and interacting with iPhone and Mac apps. But those models weren’t trained using Nvidia GPUs. They weren’t trained using GPUs at all.
Building the Apple Foundation Model
Apple used Google’s Tensor Processing Units (TPUs) to train its Apple Foundation Models, which power Apple Intelligence. A TPU is a type of computer chip called an application-specific integrated circuit, or ASIC. Unlike a GPU, an ASIC is relatively limited in the kinds of workloads it can run, but it runs those workloads extremely efficiently.
Apple used the fourth and fifth generations of Google’s TPUs to train its foundation models. Google says the TPU v5p, released late last year, is 2.8 times faster than the fourth-generation design, and that the TPU v4 was between 1.2 times and 1.7 times faster than Nvidia’s A100. Some back-of-the-envelope math then puts the v5p roughly on par with Nvidia’s latest H100 chips. In May, Google introduced its sixth-generation TPU, which it says is faster still.
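To make that back-of-the-envelope math explicit, here is a minimal sketch in Python. The only figure not taken from the paragraph above is the H100-versus-A100 training speedup, which is an illustrative assumption (Nvidia’s own figures vary by workload), not a number from this article.

# Back-of-the-envelope comparison using the figures cited above (illustrative only).
tpu_v5p_vs_v4 = 2.8          # Google: TPU v5p vs. the fourth-generation TPU v4
tpu_v4_vs_a100 = (1.2, 1.7)  # Google: TPU v4 vs. Nvidia's A100

# Chain the ratios to estimate how the v5p stacks up against the A100.
v5p_vs_a100 = tuple(round(tpu_v5p_vs_v4 * r, 1) for r in tpu_v4_vs_a100)
print(f"TPU v5p vs. A100: roughly {v5p_vs_a100[0]}x to {v5p_vs_a100[1]}x")  # ~3.4x to 4.8x

# Assumption (not from the article): Nvidia has marketed the H100 at very roughly
# 3x to 4x the A100's training throughput on large models, which is why the
# chained estimate above lands "on par" with the H100.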
In other words, when it comes to training a foundation model for generative AI, Google’s own designs appear to be just as good as Nvidia’s latest hardware. More importantly, they’re more energy- and cost-efficient than GPUs. As a result, Apple can end up with a foundation model for Apple Intelligence at a fraction of what it would have cost to use Nvidia’s chips.
Apple certainly isn’t the only company using Google’s TPUs to train its AI models, but the fact that one of the wealthiest companies in the world chose TPUs over Nvidia should serve as a major warning to Nvidia investors.
Google isn’t the only one making AI chips
Nvidia relies on a small group of tech giants for a large part of its business. Google is one of its biggest customers even though it has spent a decade designing its own AI chips. Likewise, Nvidia’s other biggest customers — Microsoft, Amazon, and Meta — are all designing their own custom silicon for training and running AI models.
On Alphabet’s second-quarter earnings call, CEO Sundar Pichai may have summarized the biggest reason these companies continue to buy every chip they can from Nvidia: “When we go through a curve like this, the risk of under-investing is dramatically greater than the risk of over-investing for us.” In other words, the potential loss from not spending heavily on AI chips and data centers is far greater than the cost of overspending.
But ultimately, Google’s TPUs and other custom silicon may prove a more cost-effective way for these tech giants to keep scaling their AI businesses. Apple’s decision to use TPUs is a major testament to that.
Customer concentration is a major risk for Nvidia. During Q1, two customers accounted for 24% of its total revenue. As these customers work to reduce their dependence on Nvidia, the chip designer could see its revenue growth slow significantly over the next few years.
With expectations for Nvidia already high, the prospect of more AI developers turning to cost-effective ASICs instead of Nvidia GPUs is a serious threat to the stock price. One bad quarter could send shares considerably lower.
Even after the pullback in price, Nvidia shares trade for over 42 times forward earnings and an enterprise value-to-revenue multiple of around 34. Both are extremely high for a company of its size, and while Nvidia has managed to beat expectations amid the ongoing AI boom, the risk of a slowdown in sales and profits keeps growing. Nvidia’s AI dominance won’t last forever, but its current share price suggests investors are acting as if it will.
Should you invest $1,000 in Nvidia right now?
Before you buy stock in Nvidia, consider this:
The Motley Fool Stock Advisor analyst team just identified what they believe are the 10 best stocks for investors to buy now… and Nvidia wasn’t one of them. The 10 stocks that made the cut could produce monster returns in the coming years.
Consider when Nvidia made this list on April 15, 2005… if you invested $1,000 at the time of our recommendation, you’d have $717,050!*
Stock Advisor provides investors with an easy-to-follow blueprint for success, including guidance on building a portfolio, regular updates from analysts, and two new stock picks each month. The Stock Advisor service has more than quadrupled the return of the S&P 500 since 2002*.
See the 10 stocks »
*Stock Advisor returns as of July 29, 2024
John Mackey, former CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool’s board of directors. Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool’s board of directors. Randi Zuckerberg, a former director of market development and spokeswoman for Facebook and sister to Meta Platforms CEO Mark Zuckerberg, is a member of The Motley Fool’s board of directors. Adam Levy has positions in Alphabet, Amazon, Apple, Meta Platforms, and Microsoft. The Motley Fool has positions in and recommends Alphabet, Amazon, Apple, Meta Platforms, Microsoft, and Nvidia. The Motley Fool recommends the following options: long January 2026 $395 calls on Microsoft and short January 2026 $405 calls on Microsoft. The Motley Fool has a disclosure policy.
Apple Just Sent a Major Warning to Nvidia Investors was originally published by The Motley Fool