A flurry of recent bad news from the world of semiconductors has been a rare blemish during an otherwise stellar stretch for the economy—and for stocks.
At the end of October, two major chip suppliers said they saw weakness in auto demand for the first time, with On Semiconductor blaming the softness on higher interest rates for car loans. Texas Instruments provided disappointing guidance, saying the industrial sector was particularly soft.
For a shrinking set of bears, those warnings are an indication that the economy isn’t on as strong a footing as many now think. Meanwhile, with the S&P 500 index now 5% away from an all-time high, stock-picking could become trickier in the months ahead.
But even if the global economy does sputter next year, there may be one technology area that looks relatively insulated: the chips and products required to build out the data centers that serve the soaring demand for artificial intelligence.
It’s essential to focus on the right companies to ride the trend. Significant computing shifts, including the internet, mobile, and cloud computing, turned out to be “winner take most” markets. Rival search engines competing with Google faltered, and the Apple iPhone took nearly all of the smartphone industry’s profits. AI probably won’t be different.
While general-use consumer-oriented chatbots like OpenAI’s ChatGPT have garnered much attention, the more significant opportunity may be in the enterprise.
Nvidia is the clearest, and most obvious, beneficiary of the AI buildout. Its stock, despite a huge run this year, remains attractively priced based on the growth outlook from the company and from Wall Street analysts.
But two other companies that are less well known to investors should benefit just as much in the year ahead: Super Micro Computer and Vertiv Holdings. They are set to ride Nvidia’s coattails.
Wall Street surveys of corporate technology buyers show that AI infrastructure and AI projects are the top priority for budget spending over the next three years. Earlier this year, Piper Sandler released a report that showed 75% of chief information officers were either testing or implementing AI projects. According to a recent survey of 600 corporate executives by Coatue, more than 60% of respondents said they plan to adopt new AI products.
CEOs have realized that AI could be an existential threat to their businesses if a start-up or another company figures out a way to use the technology against them. That’s why they are frantically investing in the technology to protect their market positions, and it’s one reason the structural shift to AI may be durable and last for several years.
The opportunity goes well beyond chatbots. Nvidia’s chief financial officer, Colette Kress, tells Barron’s that a new wave of enterprise AI adoption has recently begun. She says companies are developing customized AI for their respective industries using proprietary data to improve employee productivity.
“If companies build generative-AI solutions that leverage domain expertise and combine multiple data sets (both proprietary and public), then these solutions should generate meaningful time and cost savings,” wrote RBC Capital Markets’ technology team in a recent report.
The opportunity for Nvidia is enormous. The company has stated that the $1 trillion invested in global data-center infrastructure would eventually move from traditional server central processing units, or CPUs, to graphics processing units, or GPUs, that are better enabled to power the parallel computations needed for new applications like AI.
Nvidia dominates the market for GPUs used for AI applications. Jefferies analyst Mark Lipacis analyzed the September numbers from six top cloud-based service providers and found that Nvidia had an 86% market share for AI workloads, which is roughly flat versus the prior year.
Adding more computing power and larger data sets has generally produced better performance and results, which bodes well for the durability of Nvidia GPU sales.
Despite rising AI semiconductor competition from other large technology companies, including Microsoft, Amazon.com, Alphabet’s Google, Intel, and Advanced Micro Devices, Nvidia will probably maintain its market leadership.
A significant and often overlooked reason for Nvidia’s dominance is its software programming ecosystem, known as CUDA. Developers have built and shared AI-related tools and software libraries on Nvidia’s proprietary platform for more than a decade. That makes it easier to rapidly build AI applications, which is critical for start-ups and corporations trying to beat their competitors.
Think of CUDA as the operating system for AI, not unlike the role that Microsoft Windows played in the PC boom during the 1990s.
Unlike Microsoft, though, and more like its longtime rival Apple, Nvidia has a “full stack” model: it controls the AI experience across hardware, networking, and software. Retooling technology infrastructure to run different AI chips makes little sense when Nvidia provides the best overall package and performance with the most features.
The stock is up more than 200% this year, and yet analysts continue to underestimate the business. The company has crushed Wall Street’s sales estimates in its last three quarterly reports.
In the three months ended in October, Nvidia reported revenue of $18.1 billion, above analyst expectations of $16.2 billion and up an eye-watering 206% from the prior year.
“Our demand still continues to be quite strong,” Kress says. “At the same time, we are picking up supply.” The executive told Barron’s that Nvidia’s supply of chips would increase every quarter through fiscal 2025, addressing a key issue that has held back Nvidia’s sales in recent quarters, when supply couldn’t keep up with soaring demand.
Perhaps most important, Nvidia will start to release chips at a faster pace, making it harder for rivals to catch up. Last month, the company announced its new H200 Tensor Core GPU, scheduled for availability in the second quarter of 2024, which promises a performance boost of up to 90% over the current top-of-the-line H100.
Nvidia recently updated its product road map, moving from its previous two-year product cycle to a one-year cadence for its AI chips. A slide in a company presentation shows that Nvidia plans to release additional successors to the H200 product in 2024 and 2025.
“The bigger AI gets, the more solutions that will be needed, and the faster we will meet those goals and expectations,” Kress says.
To be sure, there’s uncertainty over the U.S. government’s export restrictions on certain Nvidia AI chips to China. Kress has stated that Nvidia’s sales to China will decline significantly in the current quarter, but the shortfall will be offset by other regions.
Rising demand for AI is also lifting demand for the AI servers that house all of those Nvidia chips. According to TrendForce, AI server shipments are estimated to increase by 38% this year and grow another 38% in 2024.
And that’s good news for Super Micro. The company is the leading independent manufacturer of high-end AI servers that fill the server racks inside data centers. More than half of its revenue is tied to AI.
Tesla and Meta Platforms are among Super Micro’s customers, according to Barclays, along with a range of start-ups and cloud-computing vendors.
Super Micro CEO Charles Liang says that customers rely on the company’s high-performance custom designs and modular systems that quickly incorporate the latest technologies into server designs. The plug-and-play nature of the servers means that Super Micro servers can incorporate new AI chips two to six months faster than rivals, he says.
Meanwhile, the company’s close partnership with Nvidia is paramount, especially when every company is scrambling to get access to Nvidia’s AI computing power.
Liang says that Super Micro has worked with Nvidia since both companies were founded in 1993. The companies’ headquarters are just 15 minutes apart in Silicon Valley.
AI isn’t hot just as a metaphor. The data centers running those AI servers generate five times more heat than traditional CPU servers and require 10 times more cooling per square foot, Vertiv says.
Vertiv works to keep those temperatures under control. Power and cooling infrastructure equipment for data centers now accounts for 75% of Vertiv’s business.
Vertiv CEO Giordano Albertazzi says the transition from traditional CPU-focused servers to GPU-powered AI servers will take a long time and drive growth for the foreseeable future.
As with Nvidia, Wall Street may actually be underestimating the opportunity. Analysts forecast Vertiv sales growth of 20% this year, falling to just 9% in 2024.
“We strongly believe this is going to be a multiyear wave,” Albertazzi says. “We see it in our [order] pipelines and our backlog. We see it in the intensity and frequency of conversations with data center customers.”
To be sure, none of these AI opportunities have gone unnoticed by investors. Shares of Nvidia, Super Micro, and Vertiv have more than tripled this year due to the market’s excitement over AI-related companies.
Even so, the stocks aren’t trading at particularly aggressive valuations, thanks to equally explosive earnings growth. The question now is whether the companies can keep growing over the long term.
Executives themselves, though, don’t seem worried about slowing growth. “This AI revolution can be bigger than the Industrial Revolution 200 years ago,” says Super Micro’s Liang. “The impact is everywhere.”
When asked on the last earnings call if its data center business could continue to grow through 2025, Nvidia CEO Jensen Huang said, “Absolutely,” citing rising demand from corporations for its new offerings.
Historically, growth companies earn price/earnings ratios well in excess of their earnings growth, but that’s not the case with Nvidia, Super Micro, or Vertiv.
Nvidia, for instance, trades at 24 times next year’s earnings estimates, a modest multiple compared with its estimated earnings growth rate of roughly 75%.
Super Micro trades at just 15 times 2024 consensus earnings, versus 28% earnings growth.
Vertiv fetches 20 times; Deutsche Bank analyst Nicole DeBlase predicts that Vertiv can grow its earnings by more than 20% annually through 2025.
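Those figures can be made concrete with a quick price/earnings-to-growth (PEG) calculation. The sketch below is a rough illustration using only the forward P/E multiples and growth estimates cited above; it is not a valuation model:

```python
# Rough PEG (price/earnings-to-growth) comparison using the forward P/E
# multiples and earnings-growth estimates cited in the article.

def peg_ratio(forward_pe: float, growth_pct: float) -> float:
    """PEG = forward P/E divided by expected annual earnings growth (in %)."""
    return forward_pe / growth_pct

# Figures as reported: (forward P/E, estimated earnings growth %)
stocks = {
    "Nvidia": (24, 75),
    "Super Micro": (15, 28),
    "Vertiv": (20, 20),
}

for name, (pe, growth) in stocks.items():
    # A PEG near or below 1.0 is conventionally read as growth
    # not yet fully reflected in the share price.
    print(f"{name}: PEG ~ {peg_ratio(pe, growth):.2f}")
```

Running this yields PEG ratios of roughly 0.32, 0.54, and 1.00, all at or below the 1.0 level that growth investors often use as a rough benchmark for a reasonably priced grower.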
As investors become more comfortable with AI-related growth, these valuation multiples are likely to rise, suggesting significant upside for all three stocks.
The past 12 months have put AI front and center in every investor’s mind. Somehow, many of them still underappreciate the opportunity. That’s not to say that everyone will win from AI. The key is finding the market leaders. Nvidia clearly fits the bill, but it’s not the only one. For AI investors, Super Micro and Vertiv could soon be household names as well.
Write to Tae Kim at [email protected]