Micron's planned $24 billion NAND flash expansion in Singapore will require 400 to 500 power transformers, roughly three to four times the 100 to 150 units a standard wafer fab typically needs, according to industry sources. This staggering electrical requirement exposes a fundamental constraint that chipmakers have overlooked amid their race to capture AI demand: heavy electrical equipment is fast becoming the limiting factor in global semiconductor manufacturing.
The scale exceeds the annual output capacity of any single Taiwanese transformer manufacturer, forcing Micron and its competitors to source equipment from multiple vendors around the world. What makes this squeeze particularly urgent is timing. Micron's Singapore project, with production targeted for late 2028, cannot begin operations without this electrical backbone in place. Any delay in transformer delivery translates directly into delayed chip production.
The root cause is straightforward: Micron's demand reflects the power intensity of modern memory fabs built for the AI era. HBM production for AI servers has driven every major memory maker into simultaneous expansion, and the electrical infrastructure required to support those fabs is now outpacing the supply chain built to serve it.
Critically, data center operators planning new facilities are in the same queue, competing with semiconductor companies for equipment that takes months to manufacture and deliver. This is not a problem that capital investment can quickly solve. Transformers are not semiconductors; they are industrial equipment with long lead times, and production capacity cannot be rapidly scaled.
The supply-chain strain is already visible in pricing. Major heavy electrical equipment suppliers Fortune Electric and Allis Electric have both implemented price increases of 20% to 30%, driven by surging orders and the rising cost of copper and other raw materials. More troubling still, some transformer manufacturers have declined to quote on large-scale semiconductor projects entirely, citing an inability to meet the tight timelines and volume requirements.
The strategic implications are significant. Delayed transformer deliveries will likely translate into delayed fabs, which in turn push back the memory production timelines that AI buyers are counting on. This creates a cascading problem: large technology companies planning data center expansions depend on memory chip delivery schedules, so any slip in semiconductor production spreads across the broader AI infrastructure buildout.
Samsung Electronics and SK hynix have also announced their own capacity expansions, all driven by the same demand curve: AI server deployments consuming HBM at volumes that existing production lines cannot satisfy. The result is a synchronized wave of fab construction across three continents, with each project competing for the same pool of heavy electrical equipment and raw materials.
This bottleneck reveals a deeper reality about the infrastructure constraints underpinning the AI boom. Physical systems cannot scale as quickly as capital can be deployed. Semiconductor fabs, data centers, and the electrical grids that power them all face interconnected capacity limits. The second half of the AI race is shifting from chips to power. Massive energy consumption and extreme peak load requirements at data centers have made electricity supply the industry's most binding physical constraint. Interconnection backlogs, hardware shortages (e.g., power transformers), and construction time mismatches mean that simply pouring capital into new power plants cannot resolve shortages in the near term.
Micron has hedged against this uncertainty by securing commitments from multiple manufacturers and ramping up orders years in advance. Still, the company cannot unilaterally solve an industry-wide supply problem. The transformer shortage is not an anomaly; it is a preview of the physical limits that will constrain AI expansion globally unless supply chains catch up.