
“Operating Profit Up 810%” — What Micron’s Strong Earnings Signal, and Whether It Can Cement Its Edge in HBM Supply

By Stefan Schneider

Stefan Schneider brings a dynamic energy to The Economy’s tech desk. With a background in data science, he covers AI, blockchain, and emerging technologies with a skeptical yet open mind. His investigative pieces expose the reality behind tech hype, making him a must-read for business leaders navigating the digital landscape.
Profit structure shifts toward high-performance memory
HBM triopoly takes shape, boosting bargaining leverage
Final pricing power still anchored in GPU platforms

U.S. memory chipmaker Micron Technology delivered record-breaking results, surprising the market. Most key indicators, including revenue and gross margin, posted their steepest gains on record, while revenue contributions expanded across business segments. With high-bandwidth memory (HBM) sales expected to keep growing, analysts see further upside ahead. Yet as competition in next-generation HBM supply intensifies and gaps among major players narrow, analysts note that ultimate pricing power remains concentrated among the small number of companies controlling GPU platforms.

Gross Margin Expected to Rise from 74.4% to 81%

According to industry sources, Micron reported revenue of $23.86 billion and operating profit of $16.135 billion for the second quarter of fiscal 2026, covering the period from November 28 last year to February 26 this year. These figures represent year-over-year increases of 196% and 810%, respectively. Under GAAP, gross margin reached 74.4% and operating margin stood at 67.6%. Diluted earnings per share surged to $12.07 from $1.41 a year earlier, more than an eightfold increase. The data reflects a simultaneous surge across nearly all major profitability metrics.
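The headline figures are internally consistent; a quick sanity check using only the revenue, operating profit, and EPS values quoted above:

```python
# Arithmetic check of Micron's reported fiscal Q2 2026 figures
# (values as quoted in the article; dollar amounts in billions).
revenue = 23.86
operating_profit = 16.135
eps_now, eps_prior = 12.07, 1.41

operating_margin = operating_profit / revenue * 100
eps_multiple = eps_now / eps_prior

print(f"Operating margin: {operating_margin:.1f}%")   # ≈ 67.6%, matching the GAAP figure
print(f"EPS multiple year over year: {eps_multiple:.1f}x")  # ≈ 8.6x
```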

CEO Sanjay Mehrotra stated during the earnings call, “We achieved record quarterly performance in revenue, gross margin, earnings per share, and cash flow,” attributing the results to record-high revenue across DRAM, NAND flash, and HBM segments. During the same period, Micron’s cloud memory revenue rose 163% year over year to $7.749 billion, while its core data center segment recorded a 211% increase to $5.687 billion.

Mobile client revenue also grew 245% year over year to $7.711 billion, reaching a level comparable to cloud memory. Although overall smartphone and PC shipments are expected to decline slightly this year due to memory supply constraints, the rise of on-device AI is driving a sharp increase in memory capacity per device. Industry estimates suggest that smartphones equipped with 12GB or more of DRAM will account for around 80% of the market this year.

This favorable environment is reflected in Micron’s third-quarter guidance. The company projected revenue of $33.5 billion for fiscal Q3 2026 and forecast a gross margin of 81%. Expected earnings per share were set at $18.90 under GAAP, with an adjusted estimate of $19.15. As revenue sources diversify across segments, profitability indicators are also expected to improve further.

Micron’s HBM4 36GB 12H, to be integrated into Nvidia’s ‘Vera Rubin’ platform (Photo: Micron)

Aggressive Capacity Expansion to Boost Supply

High-value memory demand, particularly for HBM, is identified as a key driver of profitability. On March 16, Micron announced that it had begun shipping its 12-layer 36GB HBM4 product designed for Nvidia’s ‘Vera Rubin.’ The unusual decision to specify both shipment timing and customer was interpreted as an effort to dispel market speculation that Micron’s HBM4 had failed to meet Nvidia’s performance requirements. The product delivers per-pin input/output speeds exceeding 11 Gbps and bandwidth above 2.8 TB/s, with bandwidth improved by 2.3 times and power efficiency by more than 20% compared to the previous generation.
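The quoted bandwidth follows directly from the pin speed, assuming the 2,048-bit per-stack interface width defined in the JEDEC HBM4 standard (the interface width is an assumption from the standard, not a figure from this article):

```python
# Per-stack HBM4 bandwidth from pin speed and interface width.
# Assumes the JEDEC HBM4 interface width of 2,048 bits per stack;
# the 11 Gbps pin speed is the figure quoted for Micron's part.
interface_bits = 2048   # I/O width per HBM4 stack
pin_speed_gbps = 11     # Gb/s per pin

bandwidth_tbs = interface_bits * pin_speed_gbps / 8 / 1000  # bits -> bytes -> TB/s
print(f"{bandwidth_tbs:.2f} TB/s")  # ≈ 2.82 TB/s, consistent with "above 2.8 TB/s"
```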

The expansion of the product lineup is also accelerating. Micron announced plans to provide customers with samples of a 16-layer 48GB HBM4 product built using 16 stacked 24Gb DRAM chips. This configuration offers a 33% increase in capacity compared to the currently mass-produced 12-layer version. Chief Business Officer Sumit Sadana noted that Micron has worked closely with Nvidia to ensure that computing units and memory scale together from the early stages, suggesting that its supply scope could extend beyond standalone HBM to broader AI server systems.
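The capacity figures above can be checked from the die count and die density (1 GB = 8 Gb):

```python
# Stack capacity from die count and per-die density.
die_density_gbyte = 24 / 8          # a 24 Gb DRAM die holds 3 GB
cap_12h = 12 * die_density_gbyte    # 12-layer stack capacity in GB
cap_16h = 16 * die_density_gbyte    # 16-layer stack capacity in GB
increase_pct = (cap_16h / cap_12h - 1) * 100

print(cap_12h, cap_16h, f"{increase_pct:.0f}%")  # 36.0 48.0 33%
```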

Supply-side conditions are also supporting the company’s earnings structure. Micron expects DRAM and NAND supply shortages to persist this year and agreed in January to acquire PSMC’s P5 fab in Miaoli County, Taiwan, for $1.8 billion in cash. At the same time, new fab construction and expansions are underway in Idaho and New York in the U.S., Hiroshima in Japan, and Singapore. Based on these investments, Micron plans to expand its HBM4 production capacity to 15,000 wafers per month. Industry estimates place Micron’s total HBM capacity at approximately 55,000 wafers per month, with around 30% allocated to HBM4.
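The stated allocation is consistent with the two wafer figures; a minimal check using the industry estimates quoted above:

```python
# HBM4 share of Micron's estimated total HBM wafer capacity.
hbm4_wpm = 15_000    # planned HBM4 capacity, wafers per month
total_wpm = 55_000   # industry estimate of total HBM capacity

share_pct = hbm4_wpm / total_wpm * 100
print(f"{share_pct:.0f}%")  # ≈ 27%, roughly matching the "around 30%" estimate
```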

As Micron joins Samsung Electronics and SK hynix in supplying Nvidia, the three-way HBM competition is becoming increasingly defined. Industry observers note that Micron, once seen as lagging in production capacity, has created a turning point through aggressive expansion and faster ramp-up. With product development and production schedules aligned to Nvidia’s platform, major memory makers are now competing under the same criteria. This has led to assessments that “the influence of DRAM suppliers is steadily increasing within a negotiation structure once dominated by CPU companies.”

Proven Ability to Respond to Platform Shifts

However, some analysts caution that Micron’s advantage may not be long-lasting, as final pricing authority remains with Nvidia. In the AI semiconductor market, demand is ultimately driven by Nvidia-designed GPUs and the server systems built around them. While HBM serves as a critical component within this structure, pricing is determined at the system level rather than at the individual component level. Even when memory prices rise, the impact is absorbed into the overall pricing of GPUs and server products.

This structure reflects the broader shift in the semiconductor industry from CPU-centered to GPU-centered architecture. In the past, system performance and pricing were defined around CPUs led by Intel, but since 2022, the expansion of AI workloads has positioned Nvidia GPUs as the dominant benchmark. Today, data center and AI infrastructure are built around GPU designs, with memory selection and capacity determined accordingly. This places memory suppliers in a reactive position, responding to demand rather than generating it.

Changes in competitive dynamics have also unfolded within these constraints. During the early adoption phase of HBM, SK hynix secured an early lead with stacked DRAM technology, followed by rapid catch-up from Samsung Electronics and Micron, narrowing the technological gap. Competition has since shifted toward how precisely companies can meet Nvidia’s performance requirements and production timelines. In the case of HBM4, specifications are defined in alignment with next-generation GPU platforms such as Vera Rubin, with suppliers competing on yield and supply stability within those fixed parameters.

Against this backdrop, Micron’s performance stands out as more than a simple market share gain, highlighting its ability to adapt to platform shifts. The company appears to have secured the production and development capabilities needed to quickly meet evolving specifications as Nvidia refines its next-generation GPU designs. An industry source noted, “Micron, long regarded as a distant third, has reversed the narrative in a short period,” adding that “there is a growing perception that Micron will remain a viable supplier even as Nvidia adjusts its design roadmap.”
