‘Thermodynamic computer’ can mimic AI neural networks — using orders of magnitude less energy to generate images


Scientists have built a “thermodynamic computer” capable of producing images from random disturbances in data, i.e. noise. In doing so, they imitated the generative capabilities of artificial intelligence (AI) neural networks – collections of machine-learning algorithms modeled on the brain.
Above absolute zero, the world buzzes with energy fluctuations called thermal noise, which manifest as atoms and molecules jiggling around, atomic-scale flips of spin – the quantum property that confers magnetism – and so on.
Today’s AI systems – like most other computing systems – generate images using computer chips in which the energy required to flip bits dwarfs the energy contained in random thermal-noise fluctuations, making the noise negligible.
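To get a feel for the scale gap, compare the thermal energy of one bit’s worth of noise at room temperature (the Landauer limit, k_B·T·ln 2) with a representative switching energy for conventional logic. The femtojoule figure below is an assumed order of magnitude for illustration, not a number from the study:

```python
import math

# Back-of-the-envelope comparison. The femtojoule switching energy is an
# assumed, representative order of magnitude, not a figure from the study.
k_B = 1.380649e-23                 # Boltzmann constant, J/K
T = 300.0                          # room temperature, K

thermal_bit_energy = k_B * T * math.log(2)   # Landauer limit, ~2.9e-21 J
conventional_switch = 1e-15                  # assumed CMOS-scale bit flip, J

ratio = conventional_switch / thermal_bit_energy
print(f"thermal noise scale per bit: {thermal_bit_energy:.2e} J")
print(f"conventional flip / thermal scale: {ratio:.1e}")
```

Even with generous assumptions, a conventional bit flip sits some five orders of magnitude above the thermal-noise floor, which is why that noise can normally be ignored.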
But a new “generative thermodynamic computer” works by exploiting system noise rather than in spite of it, meaning it can perform computational tasks with orders of magnitude less energy than conventional AI systems require. The scientists presented their findings in a new study published January 20 in the journal Physical Review Letters.
Stephen Whitelam, a scientist at the Lawrence Berkeley National Laboratory’s Molecular Foundry and author of the new study, made an analogy to boats on the ocean. Here, waves play the role of thermal noise, and conventional computing can be compared to an ocean liner that “plows through like it doesn’t care — very effective, but very expensive,” he said.
Reducing the power consumption of conventional computing to a level comparable to thermal noise would be like trying to cross the ocean in a boat with a small outboard motor. “It’s much more difficult,” he told Live Science. Harnessing noise, as thermodynamic computing does, can help – like “a surfer harnessing the power of the waves.”
Conventional computing works with definite binary bit values: 1 and 0. However, a growing body of research over the past decade has shown that you can get more computation out of resources such as the electricity consumed by working instead with probabilities of values.
Efficiency gains are particularly pronounced for certain types of problems called “optimization” problems, where you want to get the most out while investing the least – for example, covering the most streets to deliver mail while driving the fewest miles. Thermodynamic computing can be considered a type of probabilistic computing that uses the random fluctuations of thermal noise to power calculations.
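As a toy illustration of noise as a computational resource in optimization – a generic simulated-annealing sketch, not the hardware described in the study – walkers on a double-well cost landscape can use thermal kicks to escape a shallow local minimum that plain gradient descent would get stuck in:

```python
import numpy as np

rng = np.random.default_rng(3)

# Double-well cost landscape: global minimum near x = -1, shallower local
# minimum near x = +1, with a barrier in between.
f = lambda x: (x**2 - 1.0)**2 + 0.3 * x
df = lambda x: 4.0 * x * (x**2 - 1.0) + 0.3

n_walkers, n_steps, dt = 20, 20_000, 0.01
x = np.ones(n_walkers)                     # every walker starts in the worse well
temps = np.geomspace(1.0, 0.01, n_steps)   # slowly cooled "temperature"

for T in temps:
    # Langevin step: downhill drift plus thermal kicks of size ~ sqrt(2 T dt).
    # The kicks let walkers hop the barrier while it is hot; cooling then
    # freezes them into the deepest well.
    x += -df(x) * dt + np.sqrt(2.0 * T * dt) * rng.standard_normal(n_walkers)

print("fraction settled in the global well:", np.mean(x < 0.0))
```

Without the noise term, every walker would stay trapped near x = +1; with it, most end up in the deeper well.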
Image generation with thermodynamic computing
Researchers at Normal Computing in New York, who were not directly involved in this image-generating work, have previously built something akin to a thermodynamic computer: a network of circuits linked by other circuits, all operating at low energies comparable to thermal noise. The linking circuits could be programmed to strengthen or weaken the connection they form between the circuits they link – the “node” circuits.
Applying a voltage to the system would set a series of voltages at the different nodes, assigning them values that would eventually decay as the applied voltage was removed and the circuits returned to equilibrium.
However, even at equilibrium, noise in the circuits causes the node values to fluctuate in a very specific way determined by the programmed strength of the connections, called coupling strengths. The coupling strengths could therefore be programmed so that they effectively pose a question that is answered by the resulting equilibrium fluctuations. The researchers at Normal Computing showed that they could program the coupling strengths so that the resulting fluctuations of the equilibrium node values solved linear algebra problems.
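A standard statistical-physics result makes the linear-algebra connection concrete: a network of nodes with quadratic couplings, fluctuating at thermal equilibrium, has a covariance equal to the inverse of its coupling matrix. The sketch below simulates such a system numerically with a hypothetical 2×2 coupling matrix; it illustrates the principle, not Normal Computing’s actual circuitry:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2x2 coupling matrix (symmetric, positive definite), standing
# in for the programmed connection strengths between node circuits.
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])

dt, n_steps, burn_in = 0.01, 200_000, 20_000
x = np.zeros(2)
samples = []

# Overdamped Langevin dynamics: drift toward equilibrium plus thermal kicks.
# The stationary (Boltzmann) distribution is ~ exp(-x.T @ A @ x / 2), whose
# covariance is exactly inv(A) -- so watching the equilibrium fluctuations
# effectively computes a matrix inverse.
for step in range(n_steps):
    x = x - A @ x * dt + np.sqrt(2.0 * dt) * rng.standard_normal(2)
    if step >= burn_in:
        samples.append(x.copy())

cov = np.cov(np.array(samples).T)
print("covariance of fluctuations:\n", cov)
print("inv(A):\n", np.linalg.inv(A))
```

Reading off the fluctuation statistics, rather than stepping through a deterministic algorithm, is what does the linear-algebra work here.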
Although tuning these connections provides some control over which question the equilibrium fluctuations in node values answer, it does not allow changing the type of question. Whitelam wondered whether moving away from thermal equilibrium could let researchers design a computer that answers fundamentally different types of questions – and whether that would be more practical, since reaching equilibrium can take some time.
While thinking about the kinds of calculations that might be made possible by moving away from equilibrium, Whitelam found himself considering research from around the mid-2010s, which showed that if you took an image and added noise until no trace of the original was visible, a neural network could be trained to reverse the process and thus recover the image. Trained on a series of such disappearing images, the neural network could then generate a range of images starting from random noise, including images from outside the library it was trained on. These diffusion models seemed to Whitelam “a natural starting point” for a thermodynamic computer, diffusion itself being a statistical process rooted in thermodynamics.
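The noising half of a diffusion model can be written in a few lines. The sketch below uses the standard variance-preserving forward process with a common linear noise schedule – generic textbook choices, not parameters from the study – and shows the “signal fraction” of a toy image decaying toward zero as noise accumulates:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "image": a handful of pixel values in [0, 1].
x0 = np.array([0.0, 0.25, 0.5, 0.75, 1.0])

# Generic variance-preserving forward process with a linear schedule:
#   x_t = sqrt(abar_t) * x0 + sqrt(1 - abar_t) * noise
T = 1000
betas = np.linspace(1e-4, 0.02, T)      # per-step noise level
abar = np.cumprod(1.0 - betas)          # cumulative signal retention

for t in (0, 99, 499, 999):
    noise = rng.standard_normal(x0.shape)
    x_t = np.sqrt(abar[t]) * x0 + np.sqrt(1.0 - abar[t]) * noise
    print(f"step {t+1:4d}: signal fraction = {np.sqrt(abar[t]):.4f}")
```

By the final step essentially nothing of the original image survives; the generative trick is learning to run this process backward.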
While conventional computing works in a way that suppresses noise to negligible levels, Whitelam noted, many algorithms used to train neural networks work by adding noise back in. “Wouldn’t it be much more natural in a thermodynamic environment where noise is free?” he wrote in a conference proceedings paper.
Borrowing from age-old principles
How things change under the influence of significant noise can be calculated from the Langevin equation, which dates back to 1908. Manipulating this equation gives the probability of each step in the process of burying an image in noise – in a sense, the probability that each pixel will take on the wrong color when an image is subjected to thermal noise.
From there, it is possible to calculate the coupling strengths – for example, the circuit connection strengths – needed to reverse the process, removing the noise step by step. This generates an image, something Whitelam demonstrated in a digital simulation trained on a library of images of a “0”, a “1” and a “2”. The generated image can come from the original training set or be a kind of guess, and, as a bonus, imperfections in training mean it is possible to produce new images that are not part of the original dataset.
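For a toy data distribution whose statistics are known exactly, the step-by-step reversal can be demonstrated without training anything. In the sketch below the clean “image” is a single known pixel value, so the optimal noise estimate at each denoising step has a closed form; in a real diffusion model (or Whitelam’s simulation) a learned model supplies it instead. Run backward from pure noise, the process recovers the value:

```python
import numpy as np

rng = np.random.default_rng(2)

T = 200
betas = np.linspace(1e-4, 0.05, T)
alphas = 1.0 - betas
abar = np.cumprod(alphas)

m = 0.7                       # the one "clean" pixel value every sample shares
n = 2000
x = rng.standard_normal(n)    # start from pure noise

# Step-by-step denoising (DDPM-style ancestral sampling). Because the clean
# data is a known point mass, the optimal noise estimate eps_hat is exact;
# a trained network would normally stand in for this line.
for t in reversed(range(T)):
    eps_hat = (x - np.sqrt(abar[t]) * m) / np.sqrt(1.0 - abar[t])
    x = (x - betas[t] / np.sqrt(1.0 - abar[t]) * eps_hat) / np.sqrt(alphas[t])
    if t > 0:                 # no fresh noise on the final step
        x += np.sqrt(betas[t]) * rng.standard_normal(n)

print("recovered value:", x.mean(), "+/-", x.std())
```

Every one of the 2,000 noise samples collapses back onto the clean value, which is the denoising behavior that a thermodynamic computer aims to get from physics rather than arithmetic.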
Ramy Shelbaya, CEO of Quantum Dice, a company producing quantum random number generators that was not involved in the study, called the results “important.” He pointed to areas where traditional methods are beginning to struggle to meet the ever-increasing demand for more powerful models. Quantum Dice produces a type of probabilistic computing hardware using quantum-generated random numbers, and Shelbaya found it “encouraging to see the ever-growing interest in probabilistic computing and the various computing paradigms closely related to it.”
He also pointed to a potential benefit beyond energy savings: “This paper also shows how physics-inspired approaches can provide a clear fundamental interpretation of an area where ‘black box’ models have dominated, providing critical insights into the learning process,” he told Live Science by email.
When it comes to generative AI, recovering three learned digits from noise may seem relatively rudimentary. However, Whitelam pointed out that the concept of thermodynamic computing is still only a few years old.
“Looking at the history of machine learning and how it was eventually scaled up to bigger, more impressive tasks,” he said, “I’m curious if thermodynamic hardware, even in a conceptual sense, can be scaled in the same way.”