AI Model Mimics Brain Neurons to Reduce Energy Costs
Deployed for AI, e-prop would require only 20 watts, approximately one-millionth the energy a supercomputer uses.
Artificial intelligence models continue to grow in sophistication and complexity, adding to the need for more data, computation, and energy.
To help combat increasing energy costs, researchers at TU Graz’s Institute of Theoretical Computer Science have developed a new algorithm, called e-propagation (e-prop for short).
E-prop mimics the way neurons in the brain send electrical impulses to one another, a mechanism that lets the human brain learn with massively less energy than machine learning systems require.
“E-prop relies on two types of signals that are abundantly available in the brain, but whose precise role for learning has not yet been understood: eligibility traces and learning signals,” said Wolfgang Maass and Robert Legenstein, authors of the e-prop paper.
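The mechanism the quote describes can be sketched in a few lines: each synapse keeps a local eligibility trace (a fading memory of recent activity), and a top-down learning signal converts those traces into weight changes, one timestep at a time. The sketch below is a simplified illustration only, not the full e-prop derivation for spiking networks; all variable names, constants, and the toy task are assumptions for illustration.

```python
import numpy as np

# Hedged sketch of an e-prop-style update in a tiny rate-based network.
# alpha, eta, and the random "task" are illustrative choices, not from the article.
rng = np.random.default_rng(0)
n_pre, n_post = 4, 3
w = rng.normal(0.0, 0.1, size=(n_post, n_pre))  # synaptic weights
trace = np.zeros((n_post, n_pre))               # eligibility traces, one per synapse

alpha = 0.9   # decay: the time constant of the eligibility trace
eta = 0.01    # learning rate

for t in range(100):
    pre = rng.random(n_pre)          # presynaptic activity at time t
    post = np.tanh(w @ pre)          # postsynaptic activity
    # Local update: each trace is a fading record of recent
    # pre/post coincidences, maintained at the synapse itself.
    trace = alpha * trace + np.outer(1.0 - post**2, pre)
    target = rng.random(n_post)      # task feedback at time t
    learning_signal = target - post  # top-down error signal per neuron
    # Online weight update: learning signal times eligibility trace,
    # applied immediately -- no stored activity history is needed.
    w += eta * learning_signal[:, None] * trace
```

Because the update at each timestep uses only the current learning signal and the locally stored trace, no history of past activations has to be kept in memory, which is the property the article's next paragraphs emphasize.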
With this enormous reduction in energy usage, AI models could in theory be trained and tested on hardware far more modest than today's world-class supercomputers.
Another difference is that e-prop works completely online rather than storing data centrally, which improves efficiency by eliminating the need for separate memory and constant data transfer.
“The theory of e-prop makes a concrete, experimentally testable prediction: that the time constant of the eligibility trace for a synapse is correlated with the time constant for the history-dependence of the firing activity of the postsynaptic neuron,” said the researchers.
“It also suggests that the experimentally found diverse time constants of the firing activity of populations of neurons in different brain areas are correlated with their capability to handle corresponding ranges of delays in temporal credit assignment for learning.”