Biologically Inspired Online Learning of Visual Autonomous Driving
Top 17% of 2014 papers by citations.
Abstract
While autonomous driving systems accumulate more and more sensors as well as highly specialized visual features and engineered solutions, the human visual system provides evidence that visual input and simple low-level image features are sufficient for successful driving. In this paper we propose extensions (a non-linear update and coherence weighting) to one of the simplest biologically inspired learning schemes, Hebbian learning. We show that this is sufficient for online learning of visual autonomous driving, where the system learns to directly map low-level image features to control signals. After the initial training period, the system seamlessly continues driving autonomously. The extended Hebbian algorithm, qHebb, has constant bounds on time and memory complexity for training and evaluation, independent of the number of training samples presented to the system. Furthermore, the proposed algorithm compares favorably to state-of-the-art engineered batch learning algorithms.
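To make the baseline concrete, the following is a minimal sketch of a *plain* online Hebbian associative mapping from a feature vector to a control signal. This is not the paper's qHebb algorithm (it omits the proposed non-linear update and coherence weighting); the learning rate `eta`, forgetting factor `gamma`, and the synthetic linear task are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_features, n_controls = 8, 2

# Association matrix: accumulates outer products of control y and features x,
# so that prediction is a simple linear readout A @ x.
A = np.zeros((n_controls, n_features))
gamma = 0.999          # forgetting factor (bounded memory)
eta = 1.0 - gamma      # learning rate chosen so A converges in expectation

def hebbian_step(A, x, y):
    """One online Hebbian update: strengthen co-active feature/control pairs.

    Constant time and memory per step, independent of how many samples
    have been seen -- the property the abstract highlights for qHebb.
    """
    return gamma * A + eta * np.outer(y, x)

def predict(A, x):
    """Map low-level image features directly to a control signal."""
    return A @ x

# Online training on a synthetic linear task y = W_true @ x.
W_true = rng.normal(size=(n_controls, n_features))
for _ in range(5000):
    x = rng.normal(size=n_features)
    y = W_true @ x
    A = hebbian_step(A, x, y)
```

With zero-mean unit-variance features, the decayed Hebbian accumulator converges in expectation to the true linear map, so after training `predict(A, x)` approximates the teacher's control output; the paper's contributions address the cases a linear rule like this handles poorly.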