Artificial intelligence (AI) has transformed major industries, including healthcare, finance, retail, automotive, and manufacturing. Nvidia Corporation (NVDA) has been at the forefront of advancing AI through its graphics processing units (GPUs). These GPUs are crucial for training large language models (LLMs) such as OpenAI’s ChatGPT, leading to outstanding growth in the company’s revenue and earnings.
As a result, NVDA’s stock has surged nearly 148% over the past six months and is up more than 205% over the past year. Nvidia stock’s exceptional performance lifted its market capitalization above $3 trillion, making it the second-most valuable company in America.
However, another leading semiconductor company, Micron Technology, Inc. (MU), known for its innovative memory and storage solutions, is also experiencing remarkable growth due to rapid AI adoption.
Let’s explore how the ongoing AI boom powers Micron’s impressive growth and assess if it could outpace Nvidia in the memory chip market.
Micron’s Solid Third-Quarter Financials and Optimistic Outlook
MU posted revenue of $6.81 billion for the third quarter that ended May 30, 2024, surpassing analysts’ expectations of $6.67 billion. That compared to $5.82 billion for the previous quarter and $3.75 billion for the same period last year. Robust AI demand and strong execution enabled Micron to deliver exceptional revenue growth, exceeding its guidance range for the third quarter.
Micron’s non-GAAP gross margin was $1.92 billion, compared to $1.16 billion in the prior quarter and negative $603 million in the third quarter of 2023. Its non-GAAP operating income came in at $941 million, versus $204 million in the previous quarter and negative $1.47 billion for the same period of 2023.
Furthermore, the company posted non-GAAP net income of $702 million, or $0.62 per share, compared to a net loss of $1.57 billion, or $1.43 per share, in the same quarter last year. Its EPS surpassed the consensus estimate of $0.53.
MU’s adjusted free cash flow was $425 million, compared to negative $29 million in the previous quarter and negative $1.36 billion for the same quarter of 2023. The company ended the quarter with cash, marketable investments, and restricted cash of $9.22 billion.
“We are gaining share in high-margin products like High Bandwidth Memory (HBM), and our data center SSD revenue hit a record high, demonstrating the strength of our AI product portfolio across DRAM and NAND. We are excited about the expanding AI-driven opportunities ahead, and are well positioned to deliver a substantial revenue record in fiscal 2025,” said Sanjay Mehrotra, Micron Technology’s President and CEO.
For the fourth quarter of fiscal 2024, Micron expects revenue of $7.60 billion ± $200 million. The midpoint of its revenue guidance range represents an approximately 90% rise from the same period last year. Its non-GAAP gross margin is anticipated to be 34.5% ± 1%. In addition, the company projects its non-GAAP earnings per share to be $1.08 ± $0.08, a turnaround from a loss of $1.07 per share in the previous year’s quarter.
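To see how that roughly 90% figure follows from the guidance, here is a minimal sketch in Python. It uses only the numbers quoted above; the year-ago quarterly revenue is inferred from the stated growth rate rather than taken from the article, so treat it as an approximation.

```python
# Sketch: sanity-check Micron's Q4 fiscal 2024 guidance math from the quoted figures.
# The year-ago base is inferred from the stated ~90% rise, not quoted in the article.

guidance_midpoint = 7.60                                   # $ billions, midpoint of $7.60B +/- $0.20B
guidance_low, guidance_high = guidance_midpoint - 0.20, guidance_midpoint + 0.20

stated_yoy_growth = 0.90                                   # "approximately 90%" year-over-year rise
implied_prior_year_quarter = guidance_midpoint / (1 + stated_yoy_growth)

print(f"Guidance range: ${guidance_low:.2f}B - ${guidance_high:.2f}B")
print(f"Implied year-ago quarterly revenue: ~${implied_prior_year_quarter:.2f}B")
```

Working backward, the guidance midpoint implies a year-ago quarter of roughly $4 billion in revenue, which is consistent with the stated growth rate.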
Vital Role in the AI Ecosystem
MU’s success in the AI ecosystem is primarily driven by its high-bandwidth memory (HBM) chips, which are integral to high-performance computing (HPC), GPUs, AI, and other data-intensive applications. These chips provide the fast, efficient memory access needed to process large volumes of data.
Micron sold $100 million of its HBM3E chips in the third quarter alone. Further, the company expects its HBM3E revenue to grow from “several hundred million dollars” in fiscal 2024 to “multiple billions” in fiscal 2025.
Earlier this year, the company started mass production of its HBM3E solution for use in Nvidia’s latest AI chips. Micron’s 24GB 8H HBM3E will be part of NVIDIA H200 Tensor Core GPUs.
Moreover, Micron’s dynamic random-access memory (DRAM) and NAND flash memory are critical components in AI applications. In June, MU sampled its next-gen GDDR7 graphics memory for AI, gaming, and HPC workloads. Leveraging Micron’s 1β (1-beta) DRAM technology and advanced architecture, the GDDR7 delivers 32 Gb/s high-performance memory in a power-optimized design.
On May 1, the company reached an industry milestone as the first to validate and ship 128GB DDR5 32Gb server DRAM to meet the growing speed and capacity demands of memory-intensive generative AI applications. Powered by Micron’s 1β technology, the 128GB DDR5 RDIMM memory offers over 45% greater bit density, up to 22% improved energy efficiency, and up to 16% reduced latency over competing 3DS through-silicon via (TSV) products.
AI-Driven Demand in Smartphones, PCs, and Data Centers
AI drives strong demand for memory chips across various sectors, including smartphones, personal computers (PCs), and data centers. In its latest earnings conference call, Micron’s management pointed out that AI-enabled PCs are expected to feature 40% to 80% more DRAM content than current PCs and larger storage capacities. Similarly, AI-enabled smartphones this year carry 50% to 100% more DRAM than last year’s flagship models.
These trends suggest a bright future for the global memory chips market. According to a report by The Business Research Company, the market is expected to reach $130.42 billion by 2028, growing at a CAGR of 6.9%.
Micron’s Competitive Edge Over Nvidia and Attractive Valuation
Despite NVDA’s expected revenue jump from $60.9 billion in fiscal 2024 to around $120 billion this fiscal year, MU is projected to grow faster over the coming year. Micron’s revenue could increase by another 50% year-over-year in its next fiscal year, outpacing Nvidia’s forecasted growth of 33.7%.
In terms of non-GAAP P/E (FY2), MU is currently trading at 13.76x, 60.9% lower than NVDA, which is trading at 35.18x. MU’s forward EV/Sales and EV/EBITDA of 5.98x and 16.44x are lower than NVDA’s 26.04x and 40.56x, respectively. Also, MU’s trailing-12-month Price/Book multiple of 3.28x is significantly lower than NVDA’s 64.15x.
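For readers who want to reproduce the discount math, here is a minimal sketch using only the multiples quoted above; the values reflect the article’s data and will drift as prices change.

```python
# Sketch: reproduce the valuation discounts from the multiples quoted in the article.

multiples = {
    "Non-GAAP P/E (FY2)": (13.76, 35.18),   # (MU, NVDA)
    "Forward EV/Sales":   (5.98, 26.04),
    "Forward EV/EBITDA":  (16.44, 40.56),
    "TTM Price/Book":     (3.28, 64.15),
}

for name, (mu, nvda) in multiples.items():
    discount = (nvda - mu) / nvda * 100     # how much cheaper MU screens versus NVDA
    print(f"{name}: MU {mu}x vs. NVDA {nvda}x -> {discount:.1f}% lower")
```

The P/E row reproduces the 60.9% discount cited above; the other rows compute the same way from their respective multiples.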
Thus, Micron is a compelling investment opportunity for those seeking exposure to the AI-driven memory chip market at a more reasonable price.
Bottom Line
MU is experiencing significant growth driven by the AI boom, with impressive third-quarter financials and a strong outlook for upcoming quarters. The company’s strategic positioning in the AI-driven memory chip market, especially its HBM3E chips, is vital for high-performance computing and data-intensive applications. This positioning has enabled Micron to capitalize on surging AI demand across various sectors, including smartphones, PCs, and data centers.
On June 27, Goldman Sachs analyst Toshiya Hari maintained a Buy rating on MU shares and raised the price target to $158 from $138. Goldman Sachs’ stance indicates strong confidence in Micron’s long-term prospects, particularly with the expansion of AI computing capabilities and the company’s strategic initiatives in the memory market.
Moreover, Rosenblatt Securities reiterated its Buy rating on Micron Technology shares with an unchanged price target of $225. The firm’s optimism is fueled by expectations of solid financial results that surpass analysts’ estimates, propelled by advancements in AI and HBM.
Compared to Nvidia, Micron offers solid growth potential at a more reasonable valuation. Despite Nvidia’s dominant position in the AI and data center segment and exceptional stock performance, Micron’s revenue growth rate is projected to outpace Nvidia’s in the following year, driven by its expanding AI product portfolio and increasing market share in high-margin memory products.
For investors seeking exposure to the AI revolution, Micron presents a compelling opportunity with its solid financial performance, innovative product offerings, and competitive edge in the memory chip market.