Similarity Comparisons for Interactive Fine-Grained Categorization
Top 10% of 2014 papers (by citations)
Abstract
Current human-in-the-loop fine-grained visual categorization systems depend on a predefined vocabulary of attributes and parts, usually determined by experts. In this work, we move away from that expert-driven and attribute-centric paradigm and present a novel interactive classification system that incorporates computer vision and perceptual similarity metrics in a unified framework. At test time, users are asked to judge the relative similarity between a query image and various sets of images. These general queries do not require expert-defined terminology and are applicable to other domains and basic-level categories, enabling a flexible, efficient, and scalable system for fine-grained categorization with humans in the loop. Our system outperforms existing state-of-the-art systems for relevance feedback-based image retrieval as well as interactive classification, reducing the average number of questions needed to correctly classify an image by up to 43%.
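The abstract does not give the paper's algorithm, but the kind of human-in-the-loop loop it describes is often framed as Bayesian: maintain a posterior over classes, pick the similarity question whose answer is expected to reduce uncertainty the most, and update on the user's yes/no judgment. The sketch below illustrates that general pattern only; the class names, question likelihoods, and simulated user are all invented for illustration and are not taken from the paper.

```python
import math

def entropy(p):
    """Shannon entropy of a discrete distribution (natural log)."""
    return -sum(x * math.log(x) for x in p if x > 0)

def bayes_update(prior, p_yes, answered_yes):
    """Posterior over classes after one yes/no similarity judgment.

    p_yes[c] is an illustrative, made-up probability that a user answers
    'yes' to this question when the true class is c.
    """
    post = [pr * (py if answered_yes else 1.0 - py)
            for pr, py in zip(prior, p_yes)]
    total = sum(post)
    return [x / total for x in post]

def pick_question(prior, questions):
    """Greedily choose the question minimizing expected posterior entropy."""
    best, best_h = None, float("inf")
    for q in questions:
        p_yes = sum(pr * py for pr, py in zip(prior, q))
        h = 0.0
        for ans, p_ans in ((True, p_yes), (False, 1.0 - p_yes)):
            if p_ans > 0:
                h += p_ans * entropy(bayes_update(prior, q, ans))
        if h < best_h:
            best, best_h = q, h
    return best

# Toy run: 3 classes, 2 candidate similarity questions, uniform prior.
prior = [1 / 3, 1 / 3, 1 / 3]
questions = [[0.9, 0.1, 0.1], [0.1, 0.9, 0.1]]
true_class = 0
for _ in range(3):
    q = pick_question(prior, questions)
    answer = q[true_class] > 0.5  # deterministic simulated user
    prior = bayes_update(prior, q, answer)
# After a few answers, the posterior concentrates on class 0.
```

The greedy expected-entropy criterion is one common choice for question selection in interactive systems; any uncertainty measure over the class posterior could be substituted.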