AI is driving innovation at lightning speed, but the push for faster and easier must not overshadow our opportunities to cut carbon production while making learning more energy efficient.
Sustainable innovation is driving GE Research’s game-changing concept, the Entropy Economy. As energy consumption increases globally, we must connect how we optimize energy systems and computational systems. Jointly optimizing the two by managing their shared entropy flow is an opportunity to significantly reduce carbon emissions and energy costs while maximizing energy-efficient learning.
Did you know? By 2030, computers, including large data centers and high-performance computing (HPC) facilities, are expected to consume 10% to 20% of the energy used worldwide. As our appetite for digital grows, so too do our opportunities to innovate and evolve.
Today’s energy systems and computational systems are optimized separately. The energy grid, for example, delivers low-cost power around the world irrespective of how efficiently that energy is used. Likewise, the CERN-sponsored LHC Computing Grid optimizes the use of computational capacity without regard to energy use. Waste heat capture is a missed opportunity, and there is far more we can do with reuse. Simply put, our world lacks an efficiency metric for AI.
The Entropy Economy takes a holistic and unique approach to address the predicted exponential rise in energy consumed by compute within the next decade. GE seeks to jointly optimize learning, energy efficiency, and disposition of waste heat through a combination of energy-aware machine learning (EAML), grid architectures, and distributed HPC infrastructure.
Optimizing entropy reduction in learning while addressing entropy flow lost to thermodynamic inefficiencies will lessen carbon production, increase energy efficiency, and ultimately help stabilize the grid.
Scott Evans presented “The Entropy Economy and the Kolmogorov Learning Cycle” at the Symposium on Algorithmic Information Theory and Machine Learning, July 4-5, 2022 at the Alan Turing Institute in London, UK. Watch Scott's presentation.
Energy-aware machine learning (EAML) – Key to executing the Entropy Economy is the development of EAML algorithms that enable tradeoffs among HPC throughput, energy consumed, and output quality. Future work will produce EAML algorithms capable of balancing energy loads, curbing needless entropy flow loss, and adjusting to create ideal energy profiles.
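The kind of tradeoff EAML targets can be pictured with a minimal sketch: given several ways to run the same job, each with an estimated energy cost and output quality, pick the highest-quality option that fits the available energy. All names, configurations, and numbers below are hypothetical illustrations, not GE's EAML algorithms.

```python
# Hypothetical sketch of an energy-aware tradeoff between throughput,
# energy consumed, and output quality. Profiles are illustrative only.

def select_configuration(configs, energy_budget_kj):
    """Return the feasible configuration with the best expected quality,
    or None if no configuration fits the energy budget."""
    feasible = [c for c in configs if c["energy_kj"] <= energy_budget_kj]
    if not feasible:
        return None
    return max(feasible, key=lambda c: c["quality"])

# Made-up energy/quality profiles for one training job.
configs = [
    {"name": "fp32-full",   "energy_kj": 900, "quality": 0.95},
    {"name": "fp16-full",   "energy_kj": 500, "quality": 0.94},
    {"name": "fp16-pruned", "energy_kj": 250, "quality": 0.90},
    {"name": "int8-pruned", "energy_kj": 120, "quality": 0.85},
]

best = select_configuration(configs, energy_budget_kj=600)
```

With a 600 kJ budget, the sketch trades a small amount of quality for a large energy saving by skipping the full-precision run; a real EAML algorithm would make such choices continuously as loads and energy profiles change.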
Grid architectures – The second focus of the Entropy Economy is to move information work machines – better known as data centers and HPCs – to where clean, low-cost energy exists through an optimized compute/energy grid architecture. The work here shifts to a more holistic vision of the electric grid that seeks to jointly optimize the power source and information work systems.
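The placement idea can be sketched simply: route an information-work job to the grid region with the cleanest power that has capacity to spare. Region names, carbon intensities, and capacities below are invented for illustration; this is not GE's grid architecture.

```python
# Hypothetical sketch: send a compute job to the region with the lowest
# current carbon intensity (gCO2/kWh) that can absorb the load.

def route_job(job_kwh, regions):
    """Return the lowest-carbon region with enough spare capacity,
    or None if no region can host the job."""
    candidates = [r for r in regions if r["spare_capacity_kwh"] >= job_kwh]
    if not candidates:
        return None
    return min(candidates, key=lambda r: r["carbon_intensity"])

# Made-up snapshot of three grid regions.
regions = [
    {"name": "hydro-north", "carbon_intensity": 20,  "spare_capacity_kwh": 50},
    {"name": "solar-west",  "carbon_intensity": 45,  "spare_capacity_kwh": 500},
    {"name": "mixed-east",  "carbon_intensity": 380, "spare_capacity_kwh": 800},
]

site = route_job(job_kwh=200, regions=regions)
```

Here the cleanest region lacks capacity for the job, so the next-cleanest wins; a joint compute/energy optimization would also weigh price, latency, and waste heat reuse at each site.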
Distributed HPC infrastructure – This component considers the distribution of HPCs throughout the grid that could deliver smart loads by making the tradeoffs enabled by EAML algorithms. HPCs can optimize the use of recovered power while still achieving the desired accuracy. Leveraging EAML algorithms enables dynamic changes in numeric precision depending on the available recoverable power, which in turn addresses the constraints that desired accuracy imposes.
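Dynamic precision throttling can be illustrated with a toy policy: step numeric precision down as available recovered power shrinks. The wattage thresholds and precision tiers below are assumptions for illustration, not measured values or GE's policy.

```python
# Hypothetical sketch of precision throttling on a distributed HPC node:
# choose numeric precision from the power currently recoverable.
# Thresholds are illustrative, not measured.

def choose_precision(available_watts):
    """Map available power to a numeric precision tier."""
    if available_watts >= 300:
        return "fp32"   # ample power: full precision
    if available_watts >= 150:
        return "fp16"   # constrained: half precision
    return "int8"       # scarce: lowest-energy integer math
```

A real system would close the loop, checking that the chosen precision still meets the accuracy constraint before committing to it.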
Executed in concert, these three Entropy Economy components are poised to balance the energy load throughout the grid while increasing supercomputer/datacenter AI capacity with the same or less overall carbon production.
“We talk about ‘using’ energy, but doesn’t one of the laws of nature say that energy can’t be created or destroyed? … When we ‘use up’ one kilojoule of energy, what we’re really doing is taking one kilojoule of energy in a form that has low entropy (for example, electricity), and converting it into an exactly equal amount of energy in another form, usually one that has much higher entropy (for example, hot air or hot water). When we’ve ‘used’ the energy, it’s still there; but we normally can’t ‘use’ the energy over and over again, because only low entropy energy is ‘useful’ to us... It’s a convenient but sloppy shorthand to talk about the energy rather than the entropy…”
-- David MacKay, Sustainable Energy — without the hot air
Scott Evans - Principal Scientist, Machine Learning, Artificial Intelligence; Project Lead
Tapan Shah - Senior Scientist, Machine Learning
Hao Huang - Machine Learning Scientist, Artificial Intelligence
Jeff Maskalunas - Edison Engineer, Thermosciences
Achalesh Pandey - Research Leader, Industrial Artificial Intelligence, Artificial Intelligence