Micron falls 3.4% as Google's TurboQuant reduces AI memory needs

Algorithm slashes memory requirements by 6x, raising questions about future chip demand despite sold-out HBM supply

Micron Technology shares fell 3.4% on Wednesday, extending the memory chipmaker's losing streak to five consecutive days, after Google unveiled an algorithm that dramatically reduces memory requirements for artificial intelligence models.

The Boise, Idaho-based company's stock closed at $382.09, bringing its week-to-date decline to more than 8%. The broader memory sector also felt the pressure, with Samsung, SK Hynix and Western Digital experiencing similar sell-offs as investors weighed the implications of Google's new technology.

Google Research announced its TurboQuant algorithm suite on Wednesday, a software-only technique that achieves an average 6x reduction in key-value (KV) cache memory usage for large language models. The technology, released publicly and free for enterprise use, can also deliver an 8x speedup when computing attention logits while cutting enterprise implementation costs by more than 50%.
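The scale of that reduction is easiest to see with back-of-the-envelope arithmetic. KV cache size grows with model depth, head count, head dimension, and context length; the sketch below uses hypothetical model dimensions (not figures from the announcement) to compare 16-bit and 3-bit storage:

```python
# Illustrative KV cache sizing; the model dimensions are hypothetical,
# not taken from Google's announcement.
def kv_cache_bytes(layers, kv_heads, head_dim, seq_len, bits_per_value):
    # 2 tensors per layer (keys and values), one entry per head per position
    values = 2 * layers * kv_heads * head_dim * seq_len
    return values * bits_per_value / 8

baseline = kv_cache_bytes(32, 8, 128, 128_000, bits_per_value=16)   # FP16
compressed = kv_cache_bytes(32, 8, 128, 128_000, bits_per_value=3)  # ~3-bit
print(f"baseline:   {baseline / 2**30:.1f} GiB")
print(f"compressed: {compressed / 2**30:.1f} GiB")
print(f"reduction:  {baseline / compressed:.1f}x")
```

Quantizing from 16 bits to 3 bits per value gives roughly a 5.3x reduction on its own, in the neighborhood of the 6x average figure quoted in the announcement.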

TurboQuant addresses what Google researchers call the "memory tax" of modern AI through a two-stage mathematical approach. The first stage, PolarQuant, converts vectors to polar coordinates, while the second applies a 1-bit error correction layer using the Quantized Johnson-Lindenstrauss algorithm. The result is compression of KV caches to as little as 3 bits per value from the standard 16, with no measurable loss in accuracy.
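The polar-coordinate idea can be illustrated with a toy sketch: pair up adjacent coordinates of a vector, store each pair as a radius plus a coarsely quantized angle, and reconstruct on read. This is only a simplified illustration of the general concept; PolarQuant's actual scheme and the QJL 1-bit error-correction stage described above are more sophisticated.

```python
import math
import random

# Toy sketch of polar-coordinate quantization for a KV vector.
# Illustrative only; not Google's implementation.
def polar_quantize(v, angle_bits=3):
    """Encode adjacent coordinate pairs as (radius, coarse angle index)."""
    step = 2 * math.pi / (1 << angle_bits)
    pairs = []
    for x, y in zip(v[0::2], v[1::2]):
        r = math.hypot(x, y)                         # radius of the pair
        theta = math.atan2(y, x)                     # angle in [-pi, pi]
        q = round(theta / step) % (1 << angle_bits)  # angle_bits-bit index
        pairs.append((r, q))
    return pairs

def polar_dequantize(pairs, angle_bits=3):
    step = 2 * math.pi / (1 << angle_bits)
    out = []
    for r, q in pairs:
        theta = q * step
        out += [r * math.cos(theta), r * math.sin(theta)]
    return out

random.seed(0)
v = [random.gauss(0, 1) for _ in range(128)]
v_hat = polar_dequantize(polar_quantize(v))
# Relative reconstruction error; shrinks as angle_bits grows
err = math.sqrt(sum((a - b) ** 2 for a, b in zip(v, v_hat))) \
      / math.sqrt(sum(a * a for a in v))
```

Even this crude 3-bit angle code reconstructs the vector to within a modest relative error, which is the intuition behind why aggressive KV quantization can preserve accuracy when paired with an error-correction layer.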

The announcement sparked fears that more efficient AI systems could require fewer memory chips, potentially dampening demand for DRAM and NAND products at a time when Micron and its competitors are betting heavily on an AI-driven boom. Memory supplier stock prices immediately trended downward as markets digested the news, according to VentureBeat's analysis of the announcement.

However, some analysts counter with the Jevons paradox argument: that increased efficiency often leads to greater overall consumption. More efficient AI models could accelerate broader adoption across industries, potentially driving total memory demand higher even as individual systems require less.

The sell-off comes despite Micron's exceptionally strong fundamentals. The company reported Q2 FY26 revenue of $23.9 billion and earnings per share of $12.20, significantly surpassing consensus estimates. Management issued optimistic guidance for Q3, projecting sales of $33.5 billion with EPS of $19.15.

Perhaps more importantly, Micron's entire High-Bandwidth Memory production for 2026 is reportedly sold out under binding contracts. Global HBM spending is projected to increase 58% year-over-year in 2026, reaching $54.6 billion, driven by unprecedented demand for AI infrastructure.

Analysts remain broadly bullish on Micron despite the recent volatility. The consensus rating is "Buy" with an average price target of $524.73, representing more than 37% upside from current levels. The stock currently trades at a forward price-to-earnings ratio of just 6.92, well below the semiconductor sector average.
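The upside figure follows directly from the quoted price and target; a quick check using the article's own numbers:

```python
# Figures quoted in the article; this just verifies the implied upside.
price = 382.09          # Wednesday's closing price
target = 524.73         # consensus analyst price target
upside = target / price - 1
print(f"implied upside: {upside:.1%}")   # ~37%, matching the article

forward_pe = 6.92
implied_eps = price / forward_pe         # forward EPS implied by the P/E
print(f"implied forward EPS: ${implied_eps:.2f}")
```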

"The market may be overreacting to a software breakthrough that's still in early deployment stages," analysts noted in recent commentary. "The structural supply constraints in the memory market, combined with AI server memory consumption being 2-3 times higher than traditional servers, suggest demand will remain robust."

Memory chip prices have been rising sharply since the second quarter of 2025, with increases accelerating to 40% or more in the first quarter of 2026. A structural shortage of conventional DRAM and NAND is expected to persist well beyond 2026, potentially through 2030, due to HBM prioritization and limited wafer capacity.

For now, investors are grappling with two competing narratives: the near-term uncertainty about whether algorithmic efficiency will temper memory demand, versus the longer-term reality of structural shortages and AI-driven growth that shows no sign of abating. As Google researchers present their findings at upcoming conferences including ICLR 2026 and AISTATS 2026, the market will be watching closely for signals of hyperscaler adoption rates.