Sanjay Mehrotra
President and Chief Executive Officer at Micron Technology
Thank you, Satya. Good afternoon, everyone. In fiscal Q1, Micron delivered revenue, gross margin and EPS above the high end of the guidance ranges we provided at the last earnings call, reflecting Micron's strong execution combined with improved pricing.
We are in the very early stages of a multiyear growth phase catalyzed and driven by generative AI, and this disruptive technology will eventually transform every aspect of business and society. Memory is at the heart of GPU-enabled AI servers, and we are already seeing strong demand driven by early deployment of AI solutions, which will only accelerate over time.
Micron is well positioned to leverage this growth, having executed the most robust set of new technology and product introductions in our 45-year history. The improved supply/demand environment in the current calendar quarter gives us additional confidence in the trajectory of our business. We have driven a strong inflection in industry pricing this calendar quarter, which will allow us to benefit from higher prices earlier in our fiscal year compared to our prior plans.
We intend to stay very disciplined with our supply and capacity investments, as our pricing is still far from levels associated with the necessary ROI. We expect our pricing to continue to strengthen through the course of calendar 2024. We expect improved margins and financial performance throughout 2024 and a record industry TAM in calendar 2025.
We have made significant progress with our industry-leading technology roadmap. Micron is at the forefront of ramping the industry's most advanced technology nodes in both DRAM and NAND. The vast majority of our bits are on leading-edge nodes: 1-alpha and 1-beta in DRAM and 176-layer and 232-layer in NAND. As previously stated, both 1-beta DRAM and 232-layer NAND nodes have reached mature yields faster than the prior nodes. We expect fiscal 2024 front-end cost reductions to track in line with our long-term expectations of mid- to high-single digits in DRAM and low-teens in NAND. We are on track for volume production in 1-gamma DRAM using EUV in calendar 2025.
Now turning to our end markets. Inventories for memory and storage are at or near normal levels for most customers across PC, mobile, auto and industrial end markets. Consequently, the demand that we see from customers in these markets is closer to their end-market demand.
Data center customer inventory of memory and storage is improving and we continue to expect customer inventory to approach normal levels in this market sometime in the first half of calendar 2024. Across our data center and PC markets, we are ahead of the industry in our transition to D5, and we expect to cross over our D5 volume from D4 in early calendar 2024.
Generative AI use cases are expanding from the data center to the edge with several recent announcements of AI-enabled PCs, smartphones with on-device AI capabilities as well as embedded AI in the auto and industrial end markets. The proliferation of on-device AI at the edge offers a host of benefits such as enhanced privacy, lower latency, improved performance, greater personalization and competitive costs for a wide range of use cases from content creation to productivity.
We see a rapid evolution in our customer product roadmaps, enabling and leveraging this AI market expansion, which in turn is driving higher capacity, lower power and increased performance requirements for memory and storage. We expect to increasingly benefit from content growth as these trends in AI gain momentum.
In data center, total server unit shipments are expected to increase by a mid-single-digit percentage in calendar 2024, following a year of low-double-digit percentage decline in calendar 2023. Demand for AI servers has been strong as data center infrastructure operators shift budgets from traditional servers to more content-rich AI servers. Also, in response to AI-driven data center demand, several customers have announced aggressive roadmaps for new GPU and AI accelerator ASIC product introductions with increasing requirements for HBM capacity, performance and power.
Micron is addressing these exciting opportunities brought on by the proliferation of AI with an industry-leading portfolio of data center solutions, including HBM3E, D5, several types of high-capacity server memory modules, LPDRAM and data center SSDs. We have received very positive customer feedback on our HBM3E, which has approximately 10% better performance and about 30% lower power consumption compared to competing HBM3E offerings. In fiscal Q1, we shipped samples of HBM3E to a number of key partners and are making good progress in our qualifications.
Micron is in the final stages of qualifying our industry-leading HBM3E to be used in NVIDIA's next-generation Grace Hopper GH200 and H200 platforms. In addition, our LP5X is being used for the Grace CPU, driving a new use case for LP memory in the data center for accelerated computing.
We are on track to begin our HBM3E volume production ramp in early calendar 2024 and to generate several hundred millions of dollars of HBM revenue in fiscal 2024. We expect continued HBM revenue growth in 2025, and we continue to expect that our HBM market share will match our overall DRAM bit share sometime in calendar 2025.
Last month, we introduced the industry's fastest and lowest latency 128 gigabyte high-capacity modules built on our industry-leading 1-beta node and using a monolithic die, which does not require 3D stacking and thus enables a simpler process flow for assembly. Featuring best-in-class performance, our solution will support customers' memory-intensive data center workloads today and into the future.
Additionally, leading CPU vendors have confirmed validation support for our monolithic die-based 128-gigabyte modules on existing platforms released in 2022 and 2023 as well as upcoming new platforms. This ensures that our offering has a significant TAM that we can address immediately. We expect volume production to start next quarter with significant growth in fiscal 2025 and beyond.
A testament to our solid execution and superior offerings, Micron ended the third calendar quarter with a record-high revenue share in data center SSDs, based on independent industry assessments. This marks the second consecutive quarter of record revenue share in data center SSDs and we look to build on this revenue momentum through fiscal 2024.
In PCs, we forecast unit volumes to grow by a low- to mid-single-digit percentage in calendar 2024 after two years of double-digit percentage PC unit volume declines. We expect PC OEMs to start ramping AI-enabled PCs in the second half of calendar 2024 with an additional 4 gigabytes to 8 gigabytes of DRAM per unit, and we see average SSD capacities increasing as well. We also completed qualifications for our industry-leading 1-beta-based 16-gigabit D5 at several PC customers in fiscal Q1.
In fiscal Q1, we achieved record bit shipments in both client and consumer SSDs as customers adopted our industry-leading solutions. Building upon our QLC leadership, our client SSD QLC bit shipments also reached a new record in fiscal Q1. QLC now comprises the majority of our bit shipments mix for both client and consumer SSD.
This month, we also announced that we are shipping the Micron 3500 NVMe SSD, the world's first performance client SSD with 200-plus layer NAND. Built on our industry-leading 232-layer NAND, the 3500 will help our customers handle demanding workloads for business applications, scientific computing, gaming and content creation.
In mobile, smartphone demand is showing signs of recovery, and we forecast smartphone unit shipments to grow modestly in calendar 2024. Leading chipset vendors have announced powerful new products supporting on-device large language models with 10 billion or more parameters. We expect smartphone OEMs to start ramping AI-enabled smartphones in 2024 with an additional 4 gigabytes to 8 gigabytes of DRAM per unit.
Longer term, many popular generative AI applications will be on smartphones, and our leading product portfolio is poised to capture this memory and storage opportunity. Our new industry-leading 9.6 gigabit per second LP5X will address the bandwidth requirements of the most demanding AI-based mobile applications. We also began sampling our next-generation 232-layer NAND UFS 3.1 and our 1-beta DRAM 24-gigabit LP5X to support the memory needs of emerging AI foundational models.
Last, I'll cover auto and industrial, which are end markets we value as part of the portfolio due to their relatively more predictable revenue and profitability and long-term growth opportunity. The proliferation of AI at the edge continues to increase in the industrial and auto markets. For memory, this translates to content growth in a host of AI-enabled edge devices. For example, AI-enabled industrial PCs have 3x to 5x more memory than standard PCs, and there is an 8x increase in memory content for AI-enabled edge video security cameras compared to standard non-AI video cameras.
Our automotive business achieved a new quarterly revenue record in fiscal Q1, driven by better demand and volume ramps of new vehicle platforms. As a leader in automotive market share and quality, Micron will benefit from memory and storage content growth as automotive OEMs expand features in ADAS and in-cabin applications. Our automotive design win trajectory remains strong.
Our industrial business saw double-digit sequential growth in fiscal Q1 as the industrial market continued to recover. Inventory levels for memory and storage continued to improve at distribution partners and are at normal levels at the majority of our customers. Industry fundamentals remain strong for memory and storage as the widespread adoption of IoT, AI and machine-learning solutions creates new growth opportunities for us.
Now turning to our market outlook, starting with demand. We expect calendar 2023 DRAM bit demand to grow in the high-single-digit percentage range, up from prior expectations for mid-single-digit growth. In NAND, we continue to expect calendar 2023 bit demand growth in the high-teens percentage range. Looking forward, over the next few years, we expect bit demand growth CAGRs of mid-teens in DRAM and low-20s percentage range in NAND. We forecast calendar 2024 bit demand growth for the industry to be near the long-term CAGR for DRAM and somewhat below the long-term CAGR for NAND.
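As an aside, the multi-year impact of these CAGRs can be made concrete with a quick compounding calculation. This is an illustrative sketch only: the 15% and 21% midpoints are my readings of "mid-teens" and "low-20s," not figures stated on the call.

```python
# Illustrative sketch: compound the stated long-term bit-demand CAGRs to see
# the implied multi-year growth. The 15% (DRAM) and 21% (NAND) rates are
# assumed midpoints of "mid-teens" and "low-20s," not exact guidance.

def compound_growth(cagr: float, years: int) -> float:
    """Total bit-demand multiple after `years` at a constant annual CAGR."""
    return (1 + cagr) ** years

# Over three years, a mid-teens DRAM CAGR implies roughly 1.5x bit demand,
# and a low-20s NAND CAGR implies roughly 1.8x.
print(round(compound_growth(0.15, 3), 2))  # DRAM: 1.52
print(round(compound_growth(0.21, 3), 2))  # NAND: 1.77
```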
Turning to supply. Significant supply reductions across the industry have enabled the recovery that is now underway. An extended period in which supply grows more slowly than demand would strengthen the pace of recovery. Micron will continue to exercise supply and capex discipline, aligned with our strategy to maintain our long-term bit market share for DRAM and NAND.
Micron's fiscal 2024 capex is projected to be between $7.5 billion and $8 billion, slightly higher than last year's levels and prior plans, primarily to support the HBM3E production ramp. We continue to expect WFE capex in fiscal 2024 to be down year-over-year.
As we have discussed previously, the ramp of HBM production will constrain supply growth in non-HBM products and will help improve the overall DRAM industry supply/demand balance. Across the industry, the HBM3E die is roughly twice the size of equivalent-capacity D5. Additionally, the HBM product includes a logic interface die and has a substantially more complex packaging stack that impacts yields. These factors result in HBM consuming more than two times the wafer supply as D5 to produce a given number of bits.
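The wafer-consumption arithmetic above can be sketched in a few lines. Only the roughly 2x die-size ratio comes from these remarks; the yield factor below is an assumed illustrative value standing in for the logic interface die and packaging-stack losses.

```python
# Illustrative sketch of HBM wafer consumption relative to D5.
# Grounded assumption from the call: the HBM3E die is ~2x the size of
# equivalent-capacity D5. The yield factor is a hypothetical value
# representing losses from the logic interface die and complex packaging.

def hbm_wafer_multiplier(die_size_ratio: float, hbm_yield_factor: float) -> float:
    """Wafers needed per bit for HBM relative to standard D5.

    die_size_ratio: HBM3E die area vs. equivalent-capacity D5 (~2.0 per the call).
    hbm_yield_factor: assumed relative yield of the HBM stack (< 1.0, since the
        logic die and packaging complexity reduce good bits per wafer).
    """
    return die_size_ratio / hbm_yield_factor

# With the stated ~2x die size and an assumed 10% stack/packaging yield loss,
# HBM consumes more than 2x the wafers per bit, consistent with the remarks.
print(round(hbm_wafer_multiplier(die_size_ratio=2.0, hbm_yield_factor=0.9), 2))
```

Any yield factor below 1.0 pushes the multiplier above the 2x die-size ratio alone, which is why HBM output disproportionately constrains non-HBM supply.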
In last quarter's earnings call, we communicated that we strategically diverted underutilized equipment toward ramping new technology nodes, which will help us increase leading-edge production in a capital-efficient manner. Since the number of wafer processing steps is higher for leading-edge nodes, this approach of diverting underutilized tools to the leading edge meaningfully reduces our overall wafer capacity. Thus, underutilization in our fabs early this fiscal year transitions to structurally lower wafer capacity at higher utilization rates as we move through the fiscal year. Reports indicate that this redeployment of underutilized tools at the leading edge is an industry-wide practice that is likely to constrain industry supply in 2024.
Taking all these factors into account, Micron's bit supply growth in fiscal 2024 is planned to be well below demand growth for both DRAM and NAND, and we expect to decrease our days of inventory in fiscal year 2024. We expect calendar 2024 industry supply to be below demand for both DRAM and NAND, which will result in a contraction of industry inventory levels.
As we have highlighted before, we continue to work with the U.S. government, and CHIPS grants are assumed in our capex plans for fiscal 2024. The viability and global competitiveness of our Idaho and New York projects depend on Micron receiving CHIPS grants to address the cost difference compared to overseas expansion.
To better support our customers around the globe, we have opened state-of-the-art assembly and test facilities in Malaysia and Taiwan. We are proceeding with our previously announced expansion of our Xi'an facility, having received approval from Chinese authorities for our planned investment. In fiscal Q1, we achieved the first mobile customer qualification of LPDRAM assembled at our Xi'an site, furthering our strong commitment to serve our mobile customers in China.
Our broad, diverse network of global operations remains a key element of our strategy to address customer demand in a reliable and resilient fashion. Our leading technology, strengthening product portfolio, strong manufacturing capabilities and our dedicated team members position us well to capture the opportunities ahead.
I will now turn it over to Mark for our financial results and outlook.