Which method to use for optimal structure and function representation of large spiking neural networks: A case study on the NeuCube architecture
Abstract
This study analyses different representations of large spiking neural network (SNN) structures on conventional computers, using the NeuCube SNN architecture as a case study. The representation covers neuronal connectivity as well as the states of the network and of individual neurons during learning. Three structure types, namely the adjacency matrix, the adjacency list, and the edge-weight table, were compared in terms of storage requirements and the execution time of a learning algorithm for varying numbers of neurons in the network. The comparative analysis shows that the adjacency list, combined with a backward indexing mechanism, scales up most efficiently in both performance and storage. The best-performing representation was then used to simulate a large-scale NeuCube system with 241,606 spiking neurons in a 3D space for prediction and analysis of benchmark spatio-temporal data.
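To make the comparison concrete, the difference between a dense adjacency matrix and an adjacency list with a backward index can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names, edge format, and the exact form of the backward index are assumptions. The key point it demonstrates is that the matrix costs O(n²) storage regardless of connectivity, while the list stores only actual synapses, and the backward index lets a learning rule look up a neuron's presynaptic sources in O(in-degree) rather than scanning a full matrix column.

```python
# Illustrative sketch of two of the connectivity representations compared
# in the paper; names and data layout are hypothetical, not NeuCube code.

import numpy as np


def build_adjacency_matrix(n, edges):
    """Dense n x n weight matrix: O(n^2) storage regardless of edge count."""
    m = np.zeros((n, n))
    for pre, post, w in edges:
        m[pre, post] = w
    return m


def build_adjacency_list(n, edges):
    """Per-neuron outgoing edge lists plus a backward index of incoming
    connections, so a learning rule can find the presynaptic neurons of a
    firing neuron without scanning all n rows."""
    out_edges = [[] for _ in range(n)]  # out_edges[pre] -> [(post, w), ...]
    in_index = [[] for _ in range(n)]   # in_index[post] -> [pre, ...]
    for pre, post, w in edges:
        out_edges[pre].append((post, w))
        in_index[post].append(pre)
    return out_edges, in_index


# Toy network: 3 neurons, 3 synapses.
edges = [(0, 1, 0.5), (0, 2, 0.3), (2, 1, 0.8)]
mat = build_adjacency_matrix(3, edges)
out_edges, in_index = build_adjacency_list(3, edges)

# Presynaptic neurons of neuron 1, via the backward index:
print(in_index[1])
```

In a sparsely connected SNN of hundreds of thousands of neurons, the list form stores only the existing synapses, which is the scaling behaviour the abstract attributes to the adjacency-list variant.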