We tend to assume that artificial intelligence algorithms perform certain tasks not just faster than a person, but orders of magnitude faster. Yet, paradoxically, the more closely an AI system resembles the human brain, the better it works.
A team of IBM engineers has built a physical neural network whose components structurally resemble human neurons. When the engineers ran an AI algorithm on this new hardware, they found that it performed tasks as accurately as conventional neural networks while consuming 100 times less energy.
If this kind of neural hardware catches on, AI systems could soon perform far more computation while spending far less energy. The usual problem is that computer chips and neural-network algorithms speak two different languages, and the mismatch slows everything down. In the new system, hardware and software are matched to each other, so the AI completes its tasks faster without any loss of accuracy.
This is a step toward neural networks implemented directly in silicon. Such systems have typically not performed well, but the new study models two types of neurons: one responsible for rapid computation, the other for storing information over the long term.
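The fast/slow division can be illustrated with a toy model. The sketch below is an assumption for illustration only, not IBM's actual circuit design: it models each neuron type as a leaky integrator, where a small time constant gives rapid response and a large one gives long-term retention.

```python
import numpy as np

def leaky_step(state, inp, tau):
    """One Euler step of a leaky integrator with time constant tau."""
    return state + (inp - state) / tau

fast, slow = 0.0, 0.0
for t in range(100):
    x = 1.0 if t < 50 else 0.0          # input pulse, then silence
    fast = leaky_step(fast, x, tau=2.0)   # "rapid computation" unit
    slow = leaky_step(slow, x, tau=50.0)  # "long-term storage" unit

# After the input is removed, the fast unit has decayed to nearly zero,
# while the slow unit still retains a trace of the earlier input.
print(f"fast={fast:.4f}, slow={slow:.4f}")
```

The fast unit tracks the input closely and forgets it almost immediately; the slow unit charges and discharges gradually, acting as a memory. Combining units with different time constants is a standard way to give a network both reactivity and persistence.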
There are, of course, reasons to doubt researchers who argue that the path to working artificial intelligence lies in reconstructing the human brain. By and large, we still do not know how the brain works, and much of what goes on in our brains may turn out to be useless for a computer.
Nevertheless, the new study shows that the physical structure of human neurons has a lot to offer artificial intelligence algorithms, and today energy efficiency is essential to the optimal operation of any system.