Reference-Free Texture Image Retrieval Based on User-Adaptive Psychophysical Perception Modeling
Abstract
Texture image retrieval based on subjective visual descriptions remains a significant challenge due to the “semantic gap”: conventional Content-Based Image Retrieval (CBIR) methods rely on low-level features or reference images that often diverge from human perception. To bridge this gap, this paper proposes a reference-free, perception-driven retrieval framework that enables users to query textures directly via abstract perceptual attributes. First, we constructed a human-centric perceptual feature space through controlled psychophysical experiments, quantifying 12 explicit texture attributes (e.g., granularity, directionality) on a 9-point Likert scale. Second, to address the variability in visual sensitivity across user demographics, we developed a user-adaptive mechanism incorporating dual perceptual libraries tailored to art-major and non-art-major groups. Retrieval is formulated as a perception-aligned similarity optimization problem within this normalized space. Experimental evaluations on the Describable Textures Dataset (DTD) demonstrate that our method achieves superior perceptual consistency compared with both handcrafted descriptors (GLCM, LBP, HOG) and deep learning baselines (VGG16, ResNet50). Notably, the framework attains high PAP@3 performance across both user groups, validating its effectiveness in decoding fuzzy human intent without the need for a query image. This work provides a robust solution for semantic-based texture retrieval in human–computer interaction scenarios.
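For illustration, the sketch below outlines the reference-free retrieval step summarized above: each texture is represented by a 12-dimensional vector of perceptual attribute ratings on a 9-point Likert scale, a group-specific perceptual library is selected for the user, and textures are ranked by similarity to the user's attribute query in the normalized space. The min-max normalization, Euclidean distance, synthetic libraries, and all identifiers are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

LIKERT_MIN, LIKERT_MAX = 1.0, 9.0  # 9-point Likert scale bounds

def normalize(ratings: np.ndarray) -> np.ndarray:
    """Map 9-point Likert ratings into [0, 1] (assumed normalization)."""
    return (ratings - LIKERT_MIN) / (LIKERT_MAX - LIKERT_MIN)

def retrieve(query: np.ndarray, library: np.ndarray, k: int = 3) -> np.ndarray:
    """Return indices of the k textures whose perceptual attribute
    vectors are closest to the query in the normalized space."""
    q = normalize(query)
    lib = normalize(library)
    dists = np.linalg.norm(lib - q, axis=1)  # perception-space distance
    return np.argsort(dists)[:k]

# Dual perceptual libraries: per-group ratings for each texture over the
# 12 attributes, shape (n_textures, 12). Synthetic data stands in for the
# psychophysical measurements described in the abstract.
rng = np.random.default_rng(0)
libraries = {
    "art_major": rng.uniform(1, 9, size=(100, 12)),
    "non_art_major": rng.uniform(1, 9, size=(100, 12)),
}

# A user queries by abstract attributes alone (no reference image),
# e.g., high granularity, low directionality, and so on.
query = rng.uniform(1, 9, size=12)
top3 = retrieve(query, libraries["non_art_major"], k=3)
print("Top-3 texture indices:", top3)
```

Selecting between the two libraries by user group is what makes the ranking user-adaptive: the same attribute query can return different textures for art-major and non-art-major users.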