Sanjay Mehrotra
President and Chief Executive Officer at Micron Technology
Thank you, Satya. Good afternoon, everyone.
I am pleased to report that Micron delivered fiscal Q2 revenue, gross margin and EPS well above the high end of guidance. Micron has returned to profitability and delivered positive operating margin a quarter ahead of expectations. I would like to thank all our Micron global team members for their dedication and excellent execution that made this result possible.
Micron drove robust price increases as the supply-demand balance tightened. This improvement in market conditions was due to a confluence of factors, including strong AI server demand, a healthier demand environment in most end markets, and supply reductions across the industry. AI server demand is driving rapid growth in HBM, DDR5 and data center SSDs, which is tightening leading-edge supply availability for DRAM and NAND. This is resulting in a positive ripple effect on pricing across all memory and storage end markets. We expect DRAM and NAND pricing levels to increase further throughout calendar 2024, and we now expect record revenue and much-improved profitability in fiscal year 2025.
Micron is at the forefront of ramping the industry's most advanced technology nodes in both DRAM and NAND. Reinforcing our leadership position, over three quarters of our DRAM bits are now on leading-edge 1-alpha and 1-beta nodes, and over 90% of our NAND bits are on 176-layer and 232-layer nodes. We expect fiscal 2024 front-end cost reductions, excluding the impact of HBM, to track in line with our long-term expectations of mid- to high-single-digits in DRAM and low-teens in NAND, supported by the continued volume ramp of 1-beta DRAM and 232-layer NAND. We continue to mature our production capability with extreme ultraviolet lithography, and have achieved equivalent yield and quality on our 1-alpha and 1-beta nodes between EUV and non-EUV flows. We have begun 1-gamma DRAM pilot production using EUV and are on track for volume production in calendar 2025. The development of our next-generation NAND node is on track, with volume production planned for calendar 2025. We expect to maintain our technology leadership in NAND.
Now, turning to our end markets. Inventories for memory and storage have improved significantly in the data center, and we continue to expect normalization in the first half of calendar 2024. In PC and smartphone, there were some strategic purchases in calendar Q4 in anticipation of a return to unit growth. Inventories remain near normal levels for auto, industrial and other markets. We are in the very early innings of a multi-year growth phase driven by AI, as this disruptive technology will transform every aspect of business and society.
The race is on to create artificial general intelligence, or AGI, which will require ever-increasing model sizes with trillions of parameters. On the other end of the spectrum, there is considerable progress being made on improving AI models, so that they can run on edge devices, like PCs and smartphones, and create new and compelling capabilities.
While AI training workloads remain a key driver of technology and innovation, inference growth is also accelerating rapidly. Memory and storage technologies are key enablers of AI in both training and inference workloads, and Micron is well-positioned to capitalize on these trends in both the data center and the edge. We view Micron as one of the biggest beneficiaries in the semiconductor industry of this multi-year growth opportunity driven by AI.
In data center, total industry server unit shipments are expected to grow mid- to high-single-digits in calendar 2024, driven by strong growth for AI servers and a return to modest growth for traditional servers. Micron is well-positioned with our portfolio of HBM, D5, LP5, high-capacity DIMM, CXL and data center SSD products.
Delivering improved memory bandwidth, power consumption and overall performance is critical to enable cost-efficient scaling of AI workloads inside modern GPU- or ASIC-accelerated AI servers. Our customers are driving an aggressive AI roadmap on their GPU- and ASIC-based server platforms that require significantly higher content and higher-performance memory and storage solutions. For example, earlier this week, Nvidia announced its next-generation Blackwell GPU architecture-based AI systems, which provide a 33% increase in HBM3E content, continuing a trend of steadily increasing HBM content per GPU. Micron's industry-leading HBM3E high-bandwidth memory solution provides more than 20 times the memory bandwidth of a standard D5-based DIMM server module. We are executing well on our HBM product ramp plans and have made significant progress in ramping our capacity, yields and quality. We commenced volume production and recognized our first revenue from HBM3E in fiscal Q2, and have now begun high-volume shipments of our HBM3E product.
Customers continue to give strong feedback that our HBM3E solution has 30% lower power consumption than competitors' solutions. This benefit is contributing to strong demand. Our HBM3E product will be part of Nvidia's H200 Tensor Core GPUs, and we are making progress on additional platform qualifications with multiple customers. We are on track to generate several hundred million dollars of revenue from HBM in fiscal 2024 and expect HBM revenues to be accretive to our DRAM and overall gross margins starting in the fiscal third quarter. Our HBM is sold out for calendar 2024, and the overwhelming majority of our 2025 supply has already been allocated. We continue to expect HBM bit share equivalent to our overall DRAM bit share sometime in calendar 2025.
Earlier this month, we sampled our 12-high HBM3E product, which increases DRAM capacity per cube by 50%, to 36 gigabytes. This increase in capacity allows our customers to pack more memory per GPU, enabling more powerful AI training and inference solutions. We expect 12-high HBM3E to start ramping in high-volume production and increase in mix throughout 2025. We have a robust roadmap, and we are confident we will maintain our technology leadership with HBM4, the next generation of HBM, which will provide further performance and capacity enhancements compared to HBM3E.
We are making strong progress on our suite of high-capacity server DIMM products. During the quarter, we completed validation of the industry's first mono-die-based 128-gigabyte server DRAM module. This new product provides the industry's highest-bandwidth D5 capability, with greater than 20% better energy efficiency and over 15% improved latency performance compared to competitors' 3D TSV-based solutions. We see strong customer pull and expect a robust volume ramp for our 128-gigabyte product, with several hundred million dollars of revenue in the second half of fiscal 2024. We also started sampling our 256-gigabyte MCRDIMM module, which further enhances performance and increases DRAM content per server. We achieved record revenue share in the data center SSD market in calendar 2023. During the quarter, we grew revenue by over 50% sequentially for our 232-layer-based 6500 30-terabyte SSDs, which offer best-in-class performance, reliability and endurance for AI data lake applications.
In PC, after two years of double-digit declines, unit volumes are expected to grow modestly in the low-single-digit range for calendar 2024. We are encouraged by the strong ecosystem momentum to develop next-generation AI PCs, which feature high-performance Neural Processing Unit chipsets and 40% to 80% more DRAM content versus today's average PCs. We expect next-generation AI PC units to grow and become a meaningful portion of total PC units in calendar 2025.
At CES, the Consumer Electronics Show in Las Vegas, Micron launched the industry's first low-power compression-attached memory module, or LPCAMM2, for PC applications. LPCAMM2 brings a modular form factor, with a maximum capacity point of 64 gigabytes for PC modules and 128 gigabytes for server modules, along with benefits such as higher bandwidth, lower power and a smaller form factor.
During the quarter, we launched our 232-layer-based Crucial T705 Gen 5 consumer SSD, which won several editors' choice awards and was recognized by a leading publisher as the fastest M.2 SSD ever. We increased our client SSD QLC bit shipments to record levels, with QLC representing nearly two-thirds of our client SSD shipments, firmly establishing Micron as the leader in client QLC SSDs.
Turning to mobile. Smartphone unit volumes in calendar 2024 remain on track to grow low- to mid-single-digits. Smartphones offer tremendous potential for personalized AI capabilities that offer greater security and responsiveness when executed on device. Enabling these on-device AI capabilities is driving increased memory and storage capacity needs and increasing demand for new value-add solutions. For example, we expect AI phones to carry 50% to 100% greater DRAM content compared to non-AI flagship phones today.
Micron's leading mobile solutions provide the critical high performance and power efficiency needed to unlock an unprecedented level of AI capability. In DRAM, we are now sampling our second-generation, 1-beta LPDRAM LP5X product, which delivers the industry's highest performance with improved power efficiency for flagship smartphones. And in NAND, we announced our second generation of 232-layer NAND UFS 4.0 devices, featuring the industry's smallest package and breakthrough features that enable greater reliability and significantly higher real-world performance for complex workloads.
Our mobile DRAM and NAND solutions are now widely adopted in industry-leading flagship smartphones, with two examples being Samsung's Galaxy S24 and the Honor Magic 6 Pro announced this year. The Samsung Galaxy S24 can provide two-way, real-time voice and text translations during live phone calls. The Honor Magic 6 Pro features the Magic LM, a 7-billion parameter large language model, which can intelligently understand a user's intent based on language, image, eye movement and gestures, and proactively offer services to enhance and simplify the user experience.
Turning to auto and industrial. The automotive sector continues to experience robust demand for memory and storage as non-memory semiconductor supply constraints have eased and as new vehicle platforms are launched. In the past quarter, we experienced strong growth with partners who are driving the most advanced capabilities within the automobile's increasingly intelligent and connected digital cockpits. In addition, adoption of Level 2+ ADAS capabilities continues to gain momentum, further expanding content per vehicle. The industrial market fundamentals for memory are also healthy, with improvements in distributor inventory, book-to-bill ratios and demand visibility, as well as pricing benefits from the tight supply for products, especially those built on leading-edge nodes.
Now, turning to our market outlook. Calendar 2023 DRAM bit demand growth was in the low-double-digit percentage range, and NAND bit demand growth was in the low-20s percentage range, both a few percentage points higher than previous expectations. We forecast calendar 2024 bit demand growth for the industry to be near the long-term CAGR for DRAM and around mid-teens for NAND. Given the higher baseline of 2023 demand, these expectations of 2024 bit growth have driven an increase in the absolute level of 2024 bit demand in our model for DRAM and NAND versus our prior expectations. The industry supply-demand balance is tight for DRAM and NAND, and our outlook for pricing has increased for calendar 2024. Over the medium term, we expect bit demand growth CAGRs in the mid-teens percentage range for DRAM and the low-20s percentage range for NAND.
Turning to supply. The supply outlook remains roughly the same as last quarter. We expect calendar 2024 industry supply to be below demand for both DRAM and NAND. Micron's bit supply growth in fiscal 2024 remains below our demand growth for both DRAM and NAND, and we expect to decrease our days of inventory in fiscal year 2024. Micron's fiscal 2024 capex plan remains unchanged at a range between $7.5 billion and $8.0 billion. We continue to project our WFE spending will be down year-on-year in fiscal 2024.
Micron's capital-efficient approach to reuse equipment from older nodes to support conversions to leading-edge nodes has resulted in a material structural reduction of our DRAM and NAND wafer capacities. We are now fully utilized on our high-volume manufacturing nodes and are maximizing output against the structurally lowered capacity. We believe this approach to node migration and consequent wafer capacity reduction is an industry-wide phenomenon. We project to end fiscal 2024 with low-double-digit percentage less wafer capacity in both DRAM and NAND than our peak levels in fiscal 2022.
Significant supply reductions across the industry have enabled the pricing recovery that is now underway. Although our financial performance has improved, our current profitability levels are still well below our long-term targets, and significantly improved profitability is required to support the R&D and capex investments needed for long-term innovation and supply growth. Micron will continue to exercise supply and capex discipline and focus on restoring improved profitability, while maintaining our bit market share for DRAM and NAND.
As discussed previously, the ramp of HBM production will constrain supply growth in non-HBM products. Industry-wide, HBM3E consumes approximately three times the wafer supply of D5 to produce a given number of bits on the same technology node. Given increased performance and packaging complexity, we expect the industry-wide trade ratio for HBM4 to be even higher than that for HBM3E. We anticipate strong HBM demand due to AI, combined with the increasing silicon intensity of the HBM roadmap, to contribute to tight supply conditions for DRAM across all end markets.
Finally, as we consider these demand and technology trends, we are carefully planning our global fab and assembly/test capacity requirements to ensure a diversified and cost-competitive manufacturing footprint. Announced projects in China, India and Japan are proceeding as planned. On potential U.S. expansion plans, we have assumed CHIPS grants in our capex plans for fiscal 2024. Our planned Idaho and New York projects require Micron to receive the combination of sufficient CHIPS grants, investment tax credits and local incentives to address the cost difference compared to overseas expansion.
I will now turn it over to Mark for our financial results and outlook.