The AI trade has hit a new stage: The spotlight is shifting from the chips that process data to the hardware required to store it.
“Right now, we are very early in the memory cycle,” DA Davidson analyst Gil Luria told Yahoo Finance’s Opening Bid. “The progress that we’ve made in AI models has made it so memory is the next frontier. We need a lot more memory in the chips, in the installations, in the servers, in the data center.”
“Companies like Micron, SK Hynix and the part of Samsung that does memory are now becoming increasingly important,” he added.
These comments come as Micron (MU) continues its vertical ascent, with the stock up roughly 240% over the past year. Despite the rally, the valuation still looks strikingly cheap to some observers: the stock trades at just 9.9 times forward earnings, a steep discount to the S&P 500’s (^GSPC) 22 times and Nvidia’s (NVDA) 25 times.
As the memory supercycle takes hold, analysts identified three key names to own for the next leg of the revolution.
Idaho-based Micron has transformed from a cyclical laggard into a cornerstone of the AI server stack. The primary driver is high-bandwidth memory (HBM), a specialized DRAM variant essential for AI training. Micron recently projected the total addressable market for HBM to hit $100 billion by 2028, a staggering 40% compound annual growth rate.
“This is like getting a Mickey Mantle signed card at a garage sale,” Wedbush analyst Dan Ives said of Micron’s current price. Because HBM production is highly complex, it’s eating up capacity that would otherwise go to traditional products like smartphone memory and flash storage, allowing Micron to secure fatter margins and unprecedented pricing power.
While Micron is a domestic favorite, many on the Street view South Korea’s SK Hynix (000660.KS) as the true epicenter of the memory boom. SK Hynix is the primary supplier of HBM to Nvidia, maintaining a market share of roughly 60% as of late 2025.
However, the bull case for SK Hynix is also its biggest risk: its lead is so pronounced that demand is outstripping its production capacity. If SK Hynix cannot meet the surging demand for HBM4, the next generation of AI memory, it risks losing ground to its rivals in 2026. Still, UBS recently forecast SK Hynix’s HBM4 market share could reach 70% in 2026, as the company plays a key role in Nvidia’s next-gen Rubin platform.
A surprise standout has been Sandisk (SNDK). In the past year, shares have surged more than 800% following the spin-off from Western Digital (WDC). While most AI discussions center on DRAM, or short-term memory, Sandisk is a leading player in NAND flash, or long-term storage, which is becoming increasingly critical for what Luria describes as “AI at the edge.” That category includes devices like robots and autonomous cars, which rely on the technology to process and store data locally.
Despite the supercycle enthusiasm, Luria warned that memory remains a commodity. Unlike Nvidia’s proprietary software ecosystem, memory chips are largely “interchangeable,” which could erode pricing power once the current supply bottleneck eases.
“Nvidia can decide to order more from SK Hynix one year and more from Micron the next year,” Luria cautioned, noting there is “less sustainability” for that type of model.
For now, investors are brushing off the long-term commodity risk and focusing on the short-term supply crunch. “From a trading perspective, from a shorter-term investment perspective, that doesn’t matter when you’re in a bottleneck,” Luria added.