What is it about?
We have developed a revolutionary method that quadratically accelerates the training of artificial intelligence (AI) algorithms. It brings full AI capability within reach of inexpensive computers and, within one to two years, could enable supercomputers to run artificial neural networks (ANNs) quadratically larger than those feasible today. The proposed method, dubbed Sparse Evolutionary Training (SET), takes inspiration from biological networks, and in particular from neural networks, which owe their efficiency to three simple features: relatively few connections (sparsity), few hubs (scale-freeness), and short paths (small-worldness).

The work, reported in Nature Communications, demonstrates the benefits of moving away from the fully connected ANNs that are standard in mainstream AI. It introduces a training procedure that starts from a random, sparse network and iteratively evolves it into a scale-free system: at each step, the weakest connections are eliminated and new links are added at random, much like the biological process known as synaptic shrinking.
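The prune-and-regrow cycle described above is simple enough to sketch in a few lines. The Python/NumPy fragment below is a minimal illustration under stated assumptions, not the authors' released implementation: it assumes each sparse layer is stored as a dense matrix whose zeros mark absent connections, the fraction zeta mirrors the paper's notation for the proportion of connections replaced, and the function name set_evolution_step is ours.

import numpy as np

def set_evolution_step(weights, zeta=0.3, rng=None):
    """One illustrative SET prune-and-regrow step on a sparse layer.

    `weights` is a dense NumPy matrix in which zeros mark absent
    connections; `zeta` is the fraction of connections to replace.
    """
    if rng is None:
        rng = np.random.default_rng()

    existing = np.flatnonzero(weights)      # flat indices of active connections
    n_replace = int(zeta * existing.size)

    # 1. Prune: remove the n_replace connections with the smallest magnitude.
    weakest = existing[np.argsort(np.abs(weights.flat[existing]))[:n_replace]]
    weights.flat[weakest] = 0.0

    # 2. Regrow: add the same number of new connections at random empty
    #    positions, initialised with small random weights.
    empty = np.flatnonzero(weights == 0)
    new = rng.choice(empty, size=n_replace, replace=False)
    weights.flat[new] = rng.normal(0.0, 0.01, size=n_replace)
    return weights

In a full training run, a step like this would be applied to every sparse layer at the end of each training epoch, so the topology keeps adapting while the total number of connections, and hence the memory and compute cost, stays fixed.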
Why is it important?
The striking acceleration this method delivers matters because it opens AI to problems that are currently intractable due to their vast number of parameters; examples include affordable personalized medicine and large-scale complex systems. In rapidly changing environments such as smart grids and social systems, where an ANN must frequently be retrained on the fly, improvements in learning speed (without compromising accuracy) are essential. In addition, because training succeeds with limited computational resources, SET is well suited to embedded intelligence in the many distributed devices connected to a larger system.
Read the Original
This page is a summary of: Scalable training of artificial neural networks with adaptive sparse connectivity inspired by network science, Nature Communications, June 2018, Springer Science and Business Media, DOI: 10.1038/s41467-018-04316-3.