Computational neuroscientists suggest brains evolved as prediction machines to optimize their energy use. Yuichiro Chino (Getty Images)
A growing number of neuroscientists think that our brain is a kind of “prediction machine”: it anticipates what will happen before it happens, meaning our perceptions are partly hypotheses. Experiments by computational neuroscientists with artificial neural networks (the building blocks of AI algorithms) suggest that brains evolved as prediction machines to optimize their energy use. Life evolved by striking a fine balance between what it could “compute” and the energy it could spend. Through the intertwining of genes and form, the digital and the analogue, hardware and software, reason and emotion, the universe has produced a biological intelligence that uses far less energy than our digital computers do.
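The energy-saving logic of prediction can be illustrated with a toy sketch (entirely hypothetical numbers, not a model of any real neuron): a unit that transmits only its prediction *errors* needs far less total activity, and hence energy, than one that transmits the raw signal itself.

```python
import math
import random

random.seed(0)

# Toy "prediction machine": instead of transmitting the raw signal, the
# unit transmits only the error between its running prediction and the
# actual input. For a slowly varying signal the errors are small, so
# far less total activity (a stand-in for energy) is needed.
signal = [math.sin(4 * math.pi * t / 200) + random.gauss(0, 0.05)
          for t in range(200)]

prediction = 0.0
errors = []
for x in signal:
    err = x - prediction      # only the "surprise" is transmitted
    prediction += 0.5 * err   # update the internal model
    errors.append(err)

raw_cost = sum(abs(x) for x in signal)    # cost of sending the raw signal
pred_cost = sum(abs(e) for e in errors)   # cost of sending only the errors
print(f"raw activity: {raw_cost:.1f}  predictive activity: {pred_cost:.1f}")
```

Running the sketch shows the predictive strategy accumulating a fraction of the activity of the raw one; real predictive-coding models are far richer, but the accounting is the same.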
Artificial intelligence evolved in a very different way from biological intelligence, obeying the laws of scientific geopolitics and industrial competition almost as much as those of physics. The pioneers Alan Turing and John von Neumann, inspired by human biology and by Kurt Gödel’s mathematics, which reaches beyond logic and algorithms in its attempt to understand reality, created the first digital computers (with Manhattan Project funding and military money during the Cold War). Thanks to semiconductor physics, computers expanded their computing capacity as chips were steadily miniaturized: between 1980 and 2010, the storage and computing power of microprocessors doubled roughly every two years. This led to a separation between the work of chip (hardware) manufacturers and that of software and algorithm developers. Computer scientists grew used to thinking only about the algorithm, assuming it would run on machines capable of computing anything thrown at them.
But we are reaching the limits of this model. On the one hand, chips cannot be miniaturized much further: the 2-nanometer mark has already been reached, leaving almost no room to shrink. On the other hand, only Taiwan and South Korea know how to make the most advanced chips, creating an uncertain geopolitical situation. And there is a third problem, energy consumption, which is becoming an insurmountable obstacle for fragile globalized production chains. Data centres are estimated to consume about 3% of the world’s electricity, far more than the entire United Kingdom, and some forecasts put the figure at 13% by 2030.
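The data-centre claim can be sanity-checked with back-of-envelope arithmetic; the global and UK totals below are rough order-of-magnitude assumptions of my own, not figures from the article.

```python
# Back-of-envelope check of the data-centre figure. The world and UK
# totals are rough assumptions (order of magnitude only).
world_electricity_twh = 25_000   # assumed global annual electricity use, TWh
uk_electricity_twh = 300         # assumed UK annual electricity use, TWh
datacenter_share = 0.03          # the article's figure: ~3% of the world total

datacenter_twh = world_electricity_twh * datacenter_share
print(f"data centres: ~{datacenter_twh:.0f} TWh; UK: ~{uk_electricity_twh} TWh")
```

Under these assumptions, 3% of the world total works out to roughly 750 TWh, comfortably above the UK's entire consumption, so the comparison in the text is at least arithmetically plausible.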
The supercomputers used for weather models, drug design, aircraft, cars and the like also consume heavily: roughly as much electricity as a city of 10,000 people. The Summit supercomputer at Oak Ridge National Laboratory, for example, produces annual CO2 emissions equivalent to more than 30,000 round-trip flights between Washington and London. A single training run for a powerful AI algorithm (a language translator, say) costs $4 million in electricity bills. A single cryptocurrency transaction consumes as much electricity as a typical family does in a week.
These exorbitant costs limit what can, and should, be computed. Scientists are trying to improve the situation, but in an uncoordinated way. They do share one thing: all look for inspiration in living structures capable of computing with very little expenditure of energy. Algorithm designers are trying to incorporate the brain’s predictive ability, mentioned at the beginning, in order to reduce the number of AI parameters. But the powerful are not heading in this direction. The race to “artificial superintelligence” began in 2020, when OpenAI, co-founded by Elon Musk, unveiled GPT-3 with 175 billion parameters in its algorithm. Google (1.6 trillion parameters) and the Beijing Academy of Artificial Intelligence (1.75 trillion) followed in 2021. However, it is not clear that simply scaling up the algorithm leads to superintelligence, since power consumption imposes a limit that cannot be exceeded.
Some scientists are aware that the only solution is to return to biology. As in our brain, the hardware and the algorithm/software must be tightly coupled. One particularly interesting area gaining traction is that of “neuromorphic chips”. Neuromorphic designs mimic the architecture of the gelatinous blob in our heads, placing processing units alongside memory rather than keeping them apart. They use analog computing, which, like real neurons, can process continuous signals. Several analog neuromorphic computers are already operational; two US examples are NeuRRAM (claimed to consume 1,000 times less energy than a digital chip) and Neurogrid, from the Brains in Silicon lab at Stanford. In Europe, IMEC built the world’s first self-learning neuromorphic chip and demonstrated its ability to learn to compose music. It is not clear how these new systems will reach the real world: designing hardware is risky and expensive, costing $30-80 million and two to three years per new chip.
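A flavor of what neuromorphic hardware computes is given by the leaky integrate-and-fire neuron, the kind of unit such chips typically implement directly in analog circuitry. The sketch below is a generic textbook model with illustrative parameters, not the design of NeuRRAM or Neurogrid: the key point is that the unit only "fires" on rare threshold crossings, rather than switching on every clock tick.

```python
import random

# Minimal leaky integrate-and-fire (LIF) neuron with illustrative
# parameters. The membrane voltage leaks toward zero while integrating
# input current; a spike is emitted only when a threshold is crossed,
# so activity (and, in analog hardware, energy use) stays sparse.
random.seed(1)
tau, v_thresh, v_reset, dt = 20.0, 1.0, 0.0, 1.0

v, spikes, steps = 0.0, 0, 1000
for _ in range(steps):
    i = random.uniform(0.0, 0.12)   # random input current
    v += dt * (-v / tau + i)        # leaky integration of the input
    if v >= v_thresh:               # threshold crossing -> emit a spike
        spikes += 1
        v = v_reset
print(f"{spikes} spikes in {steps} timesteps")
```

On a digital computer this loop runs instruction by instruction; on a neuromorphic chip the leak and integration are physical properties of the circuit, which is where the energy savings come from.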
Perhaps it is precisely the geopolitical situation, as with the birth of the first computers, that will give us the impetus. In China, neuromorphic computing is seen as one of the areas where the country can outperform current digital systems, and there are dedicated labs in all the leading universities. In the United States, the Chief Digital and Artificial Intelligence Office (CDAO) and other military institutions are already developing and funding neuromorphic hardware for combat use, with applications including smart headsets and glasses, drones and robots.
In an unstable world once again threatened by wars, geopolitics could lead us to reinvent computer science and reconnect with Gödel, Turing and von Neumann in order to overcome their limitations. They knew that reality cannot be fully simulated by digital algorithms. We return to the physics of reality, which always eludes the total control of human logic, to try to move forward.