In a public statement on December 4th, NVIDIA CEO Jensen Huang said that as artificial intelligence advances rapidly and is applied more widely around the world, energy supply is becoming a critical bottleneck for AI development.
He emphasized that AI computing, particularly the training and deployment of large-scale models, is putting unprecedented pressure on power infrastructure, and that current energy systems may be unable to sustain the exponential growth in compute demand over the long term.
Huang further predicted that, to address this challenge, small modular reactors (SMRs) are likely to become an important energy source for artificial intelligence systems within the next decade.
SMRs can be deployed flexibly, offer strong safety characteristics, and produce low carbon emissions, allowing them to supply stable, concentrated clean electricity to data centers and AI computing clusters, easing the strain on traditional power grids and supporting the sustainable growth of AI.
His remarks reflect a growing consensus in the industry:
AI progress depends not only on algorithms and hardware but also on transforming the energy systems that power them.
As competition over computing power intensifies, whoever achieves breakthroughs in energy efficiency and supply is likely to gain the advantage in the next stage of AI-driven transformation.