Variance analyses for kernel regressors with nested reproducing kernel Hilbert spaces
2012, Vol. 95, No. 8, pp. 2001–2004
Abstract
Learning based on kernel machines is widely known as a powerful tool in various fields of information science, including signal processing tasks such as function estimation from a finite set of sampling points. One of the central topics in kernel machines is model selection, especially the selection of a kernel or its parameters. In our previous work, we investigated the generalization error of the model space itself corresponding to a selected kernel in kernel regressors. In this paper, we analyze the generalization error within the model space corresponding to a selected kernel in kernel regressors, and prove that the variance of a learning result is reduced when we adopt a kernel corresponding to a larger reproducing kernel Hilbert space.
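The abstract's claim concerns the variance of the learned function over noise realizations when kernels with nested RKHSs are compared. The following is a minimal sketch of how that variance can be estimated empirically, assuming Gaussian kernels (whose RKHSs are nested by bandwidth) and kernel ridge regression as the kernel regressor; the paper's exact estimator and setting may differ, and the widths, regularization parameter, and helper names here are illustrative assumptions, not from the paper.

```python
import numpy as np

def gaussian_kernel(X, Y, width):
    # k(x, y) = exp(-(x - y)^2 / width); in the Gaussian family,
    # a smaller width yields a larger RKHS (the spaces are nested)
    d2 = (X[:, None] - Y[None, :]) ** 2
    return np.exp(-d2 / width)

def krr_predict(x_train, y_train, x_test, width, lam=1e-3):
    # Kernel ridge regression: f(x) = k(x, X) (K + lam I)^{-1} y
    K = gaussian_kernel(x_train, x_train, width)
    alpha = np.linalg.solve(K + lam * np.eye(len(x_train)), y_train)
    return gaussian_kernel(x_test, x_train, width) @ alpha

def estimate_variance(width, n_trials=200, noise=0.1, seed=0):
    # Monte-Carlo estimate of the pointwise variance of the learned
    # function across independent noise realizations, averaged over
    # a grid of test points
    rng = np.random.default_rng(seed)
    x_train = np.linspace(0.0, 1.0, 20)
    x_test = np.linspace(0.0, 1.0, 50)
    target = np.sin(2 * np.pi * x_train)
    preds = np.stack([
        krr_predict(x_train, target + noise * rng.standard_normal(20),
                    x_test, width)
        for _ in range(n_trials)
    ])
    return preds.var(axis=0).mean()

# Two widths whose RKHSs are nested: H_{width=0.05} contains H_{width=0.5}
var_larger_space = estimate_variance(width=0.05)
var_smaller_space = estimate_variance(width=0.5)
print(var_larger_space, var_smaller_space)
```

Comparing the two printed variances for nested kernels is the empirical analogue of the comparison the paper carries out analytically.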