The demand for GPUs, accelerators, memory and interconnect
chips for the rollout of AI in data centres has driven the industry to boom levels,
elevating Nvidia and foundry TSMC to the very top.
This boom is welcome in that it pays for the next couple of
generations of process technology development. Designs on 2nm, 18A and 16A
process nodes will reduce power consumption and boost performance, both
prerequisites for AI datacentre applications, but the costs of designing and
making such chips, from the mask sets to high-NA lithography, are
eye-wateringly high.
Recent market analysis and financial results show that this
boom in AI chips is masking significant weakness in other areas.
This was particularly apparent in discussion about the UK and European
distribution market, which services the rest of the industry. The AI chips and
high-performance HBM memories are all shipped direct to server makers and so do
not show up in distribution.
2025 will be a year of no growth for distributors in the UK and
Ireland, according to the latest forecast from the Electronic Components Supply
Network (ecsn). This follows a fall of 4.9% in 2024 as a result of excess
inventory.
The weakness also became apparent this week in results from
Micron, Broadcom and Marvell.
Micron is warning of reduced profits next year, as underlying demand for memory in industrial and consumer applications is weak and unlikely to recover until the second half of 2025. This comes despite a doubling of quarterly revenue over the last year on the back of the AI boom.
“Data centre revenue grew over 400% year over year and 40%
sequentially, reaching a record level with data centre revenue mix surpassing
50% of Micron's revenue for the first time,” said Sanjay Mehrotra, president
and CEO of Micron Technology.
“We are now seeing a more pronounced impact of customer
inventory reductions. As a result, our fiscal Q2 bit shipment outlook is weaker
than we previously expected. We expect this adjustment period to be relatively
brief and anticipate customer inventories reaching healthier levels by spring,
enabling stronger bit shipments in the second half of fiscal and calendar 2025,”
he added.
The PC refresh cycle is unfolding more gradually, and Micron
expects PC shipments to be flat this year. However, Mehrotra remains optimistic
about AI PC adoption over time. AI PCs will require additional DRAM content,
with a minimum of 16Gbytes of DRAM for entry-level PCs and 24Gbytes and above
for the higher-end segments, versus 12Gbytes for PCs last year.
The slump in the automotive and industrial markets has also hit Micron.
“Lower-than-expected automotive unit production, combined
with a shift toward value-trim vehicles from premium models and EVs, has slowed
memory and storage content growth and resulted in inventory adjustments at
OEMs. Longer term, we remain optimistic that ADAS, infotainment, and AI
adoption across auto will drive long-term memory and storage content growth.
Industrial market demand continues to be impacted by inventory adjustments, and
we expect a recovery in this market later in calendar 2025,” he said.
Broadcom is similarly benefitting from the boom for its interconnect
chips and from its purchase of VMware, which provides virtual machine
infrastructure for all those datacentre CPUs. It saw revenues up 44% year over
year to a record $51.6bn, but excluding VMware, growth was 9%.
However, a closer look shows the same issues as at Micron. The
AI revenue, which came from strength in custom AI accelerators, or XPUs, and
networking, grew 220% from $3.8 billion in fiscal 2023 to $12.2 billion in
fiscal 2024 and represented 41% of semiconductor revenue. This drove
semiconductor revenue up to a record $30.1 billion for the year, says CEO Hock
Tan.
The wireless business grew 7% with new Wi-Fi 7 access points,
while the broadband business fell by 51%.
“So, the reality going forward for this company is that the AI semiconductor business will rapidly outgrow the non-AI semiconductor business,” said Tan. “Q4 AI revenue grew a strong 150% year on year to $3.7 billion. Non-AI semiconductor revenue declined by 23% year on year to $4.5 billion, but still a 10% recovery from the bottom of six months ago.
“On the broad portfolio of non-AI semiconductors with its
multiple end markets, we saw a cyclical bottom in fiscal 2024 at $17.8 billion.
We only expect a recovery in the second half of 2025 at the industry's
historical growth rate of mid-single digits.”
“In sharp contrast, we see our opportunity over the next
three years in AI as massive. Specific hyperscalers have begun their respective
journeys to develop their own custom AI accelerators or XPUs, as well as
network these XPUs with open and scalable Ethernet connectivity. For each of
them, this represents a multiyear, not a quarter-to-quarter journey,” he said.
Broadcom has three hyperscale customers who have each developed
a multi-generational AI XPU roadmap to be deployed at varying rates
over the next three years. By 2027, Tan believes, each of them plans to deploy
clusters of one million XPUs across a single fabric.
“Keep in mind, though, this will not be a linear ramp. It will
show quarterly variability. To compound this, we have been selected by two
additional hyperscalers and are in advanced development for their own
next-generation AI XPUs. We have line of sight to develop these prospects into
revenue-generating customers before 2027,” said Tan.
Marvell is taking even more drastic measures to focus on AI.
“Marvell is entering a new era of growth through the substantial volume production ramp of our custom silicon programs, along with continued strong growth in optics,” said Matthew Murphy, Chair, President, and CEO of Marvell.
The company is expanding its strategic relationship with
Amazon Web Services for data centre semiconductors, including custom AI
products, optical DSPs, active electrical cable DSPs, PCIe retimers,
interconnect optical modules, and Ethernet switching silicon solutions. It is
also working with AWS for EDA in the cloud to accelerate silicon design, with
significant interest in 2nm designs.
“In this third quarter, we made decisions to further
solidify and purposefully redirect our investments toward data centre relative
to our other end markets. These actions resulted in a restructuring charge in
the third quarter, which Willem will discuss in his section. The goal of these
actions is to increase our R&D intensity toward the data centre, our
largest and fastest-growing opportunity,” said Murphy.
Will edge AI be a saving grace? There is a strong focus on
AI technology migrating to the edge of the network with lower-cost, lower-power
implementations. That is likely to drive more local production of server
boards, boosting local economies.
www.micron.com; www.broadcom.com; www.marvell.com