We take for granted the vast computing power of our brains. But scientists are still trying to bring computers to the level of the brain.
That's how we ended up with artificial intelligence (AI) algorithms that learn through networks of virtual neurons: the neural network.
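To make the idea of a "virtual neuron" concrete, here is a minimal sketch (not from the research described below): each artificial neuron multiplies its inputs by learned weights, sums them, and fires if the total crosses a threshold. The weights shown are illustrative values chosen for the example.

```python
def neuron(inputs, weights, bias):
    """One virtual neuron: weighted sum of inputs plus a bias,
    passed through a simple step activation."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    # The neuron "fires" (outputs 1) only when the sum crosses zero
    return 1 if total > 0 else 0

# Illustrative weights that make this neuron act like a logical AND:
# it fires only when both inputs are 1
print(neuron([1, 1], [0.6, 0.6], -1.0))
```

A neural network is simply many of these units wired together in layers, with the weights adjusted during training.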
Now, a team of engineers has taken another step toward emulating the computers in our skulls: they've built a physical neural network, with circuitry that looks even more like neurons. When they tested an AI algorithm on the new type of circuit, they found that it worked just as well as the conventional neural networks already in use. But the new integrated neural network system completed the task with 100 times less energy than a conventional AI algorithm.
If these new neuron-based circuits take off, artificial intelligence researchers could soon do a lot more computing with a lot less energy. Like using a tin-can telephone to talk to a real phone, computer chips and neural network algorithms simply speak different languages and, as a result, work together more slowly. But in the new system, the hardware and the software were designed to work perfectly together. Thus, the new AI system completed tasks much faster than a conventional system, without any drop in accuracy.
This is a step forward from previous attempts to create neural networks based on silicon. Generally, artificial intelligence systems based on these types of neuron-inspired chips don't work as well as conventional artificial intelligence. But the new research modeled two types of neurons: one geared toward fast computations and another designed to store long-term memory, the researchers explained to the MIT Technology Review.
There are good reasons to be skeptical of any researcher who claims that the answer to artificial intelligence and true consciousness is to recreate the human brain. That's because, fundamentally, we know very little about how the brain works. And chances are there are many things in our brains that a computer would find useless.
But still, the researchers behind the new artificial neural hardware were able to glean important lessons from how our brains work and apply them to computer science. In that sense, they figured out how to improve artificial intelligence by borrowing what our brains have to offer without overwhelming themselves trying to reconstruct the whole damn thing.
As technology sucks up more and more energy, the hundred-fold improvement in energy efficiency of this AI system means that scientists will be able to tackle big questions without leaving such a large footprint on the environment.
Source: Futurism