New Non-Volatile Memory Technologies and Neuromorphic Computing
Abstract
Machine learning applications such as image and audio recognition require high data throughput for fast training and classification. Conventional computer systems still follow the von Neumann architecture, which separates the computation and memory units and connects them through a data bus. This von Neumann bottleneck restricts bandwidth and increases power consumption, running contrary to the needs of machine learning. Neuromorphic computing based on emerging non-volatile memory technologies may address these issues: by performing computation in a manner similar to the human brain, which processes large quantities of information in parallel, it avoids the von Neumann bottleneck. Nevertheless, neuromorphic computing still faces many open problems and concerns. This article addresses three key aspects of neuromorphic computing. We survey the most recent neuromorphic system proposals and look ahead to what may be possible in the near future; we examine simulators for this new computing paradigm at the device, circuit, architecture, and system levels; and we review hardware security vulnerabilities of neuromorphic designs in light of recent progress in the CMOS field.
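To make the contrast with the von Neumann bus concrete, the following minimal sketch (not from the paper; the array shapes and values are illustrative assumptions) shows the core operation that non-volatile memory crossbars perform "in memory": each cell's conductance G[i][j] encodes a synaptic weight, and applying input voltages V to the row lines produces column currents I = Gᵀ·V by Kirchhoff's current law, so the multiply-accumulate happens where the weights are stored instead of shuttling them over a data bus.

```python
import numpy as np

# Conductance matrix of a 3x2 memristor crossbar: 3 row lines, 2 column lines.
# Each entry is a programmed (non-volatile) synaptic weight.
G = np.array([[0.2, 0.5],
              [0.8, 0.1],
              [0.4, 0.9]])

# Input voltages applied to the 3 row lines (the activations).
V = np.array([1.0, 0.5, 0.25])

# Column currents: each column sums G[i][j] * V[i] over its rows in one
# analog step, i.e. I = G^T V. In a von Neumann machine, every G[i][j]
# would instead have to cross the memory bus to reach the ALU.
I = G.T @ V
print(I)  # [0.7, 0.775]
```

The point of the sketch is the dataflow, not the numerics: the weights never move, only the small input and output vectors do, which is why such architectures sidestep the bandwidth and power costs the abstract describes.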
Related Papers
- All-memristive neuromorphic computing with level-tuned neurons (2016), 126 citations
- From von Neumann, John Atanasoff and ABC to Neuromorphic Computation and the NeuCube Spatio-Temporal Data Machine (2016), 8 citations
- Neuromorphic Computing: Review of Architecture, Issues, Applications and Research Opportunities (2022), 3 citations
- From von Neumann Machines to Neuromorphic Platforms (2018), 3 citations
- 2T1M Neuromorphic Synapse with Pt-Hf-Ti Memristor Model (2022), 1 citation