
Micron Technology (NASDAQ:MU) edged higher Tuesday after the memory maker revealed it had sent sample shipments of its 36GB HBM4 12-High to multiple key customers.
The shipment prompted Wells Fargo to reiterate its Overweight rating and $130 price target, as Micron remains on track to ramp HBM4 volumes by calendar year 2026 and gain market share.
“MU’s HBM4 leverages the company’s 1β (1-beta) DRAM process and is noted to feature the built-in self-test tech,” said Wells Fargo analysts, led by Aaron Rakers, in a Tuesday investor note. “MU’s HBM4 also features >20% better power efficiency vs the prior gen HBM3E and achieves speeds >2.0TB/s per memory stack and more than 60% better performance over the prior gen HBM3E at 1.2TB/s.”
Wells Fargo said that SK Hynix (OTCPK:HXSCF), which is the current leader in HBM share, announced its HBM4 12-high sample shipments in mid-March, with mass production expected during the second half of 2025.
“Samsung (OTCPK:SSNLF), which has struggled in HBM3/3E, had noted during their 1Q25 earnings call that it targets mass production of HBM4 in 2H25; revenue contributions starting in 2026,” Rakers noted.
“As a reminder, Micron had reported that its HBM revenue surpassed $1.0B during F2Q25 (Feb ’25), growing >50% q/q; MU has consistently reiterated its target of achieving a DRAM-like HBM share (i.e., low-20%) by the end of CY25,” he added.
Micron estimates its HBM total addressable market at more than $35B in 2025, up from $16B in 2024, and expects it to expand to $64B by 2028 and $100B by 2030.
During the first quarter of 2025, SK Hynix held 36% of DRAM revenue share, according to Counterpoint Research, followed by Samsung at 34% and Micron at 25%.
“Right now the world is focused on the impact of tariffs, so the question is: what’s going to happen with HBM DRAM?” said Counterpoint research director MS Hwang. “At least in the short term, the segment is less likely to be affected by any trade shock, as AI demand should remain strong. More significantly, the end product for HBM is AI servers, which – by definition – can be borderless.”