Recursive Recurrent Nets with Attention Modeling for OCR in the Wild
Top 1% of 2016 papers by citations.
Abstract
We present recursive recurrent neural networks with attention modeling (R2AM) for lexicon-free optical character recognition in natural scene images. The primary advantages of the proposed method are: (1) use of recursive convolutional neural networks (CNNs), which allow for parametrically efficient and effective image feature extraction, (2) an implicitly learned character-level language model, embodied in a recurrent neural network which avoids the need to use N-grams, and (3) the use of a soft-attention mechanism, allowing the model to selectively exploit image features in a coordinated way, and allowing for end-to-end training within a standard backpropagation framework. We validate our method with state-of-the-art performance on challenging benchmark datasets: Street View Text, IIIT5k, ICDAR and Synth90k.
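The soft-attention mechanism described above lets the decoder RNN compute a weighted sum over image feature vectors at each character step, with weights that are differentiable and thus trainable end-to-end by backpropagation. The sketch below illustrates one step of additive (content-based) soft attention in NumPy; the function names, shapes, and random parameters are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax: subtract the max before exponentiating
    e = np.exp(x - x.max())
    return e / e.sum()

def soft_attention(features, hidden, W_f, W_h, v):
    """One step of additive soft attention (illustrative sketch).

    features: (T, D) feature vectors extracted from the image
    hidden:   (H,)   current decoder RNN hidden state
    W_f, W_h, v:     learned projection parameters (hypothetical shapes)
    Returns a context vector (D,) and the attention weights (T,).
    """
    # Score each feature location against the current hidden state
    scores = np.tanh(features @ W_f + hidden @ W_h) @ v  # (T,)
    alpha = softmax(scores)                              # weights sum to 1
    context = alpha @ features                           # weighted sum over locations
    return context, alpha

# Toy usage with random parameters
rng = np.random.default_rng(0)
T, D, H, A = 5, 8, 6, 4          # locations, feature dim, hidden dim, attention dim
feats = rng.normal(size=(T, D))
h = rng.normal(size=(H,))
W_f = rng.normal(size=(D, A))
W_h = rng.normal(size=(H, A))
v = rng.normal(size=(A,))
ctx, alpha = soft_attention(feats, h, W_f, W_h, v)
```

Because every operation here (matmul, tanh, softmax, weighted sum) is differentiable, gradients flow from the character-prediction loss back through the attention weights into both the decoder and the feature extractor, which is what enables the end-to-end training the abstract mentions.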