Overtraining in back‐propagation neural networks: A CRT color calibration example
Color Research & Application, 2002, Vol. 27(2), pp. 122–125
Abstract
A condition of overtraining a back‐propagation neural network exists when excessive model degrees of freedom are used in network training. A CRT color calibration experiment was done to illustrate methods to avoid an overtrained condition in model development. Cross‐validation, in which the experimental data are split into parameter‐training and independent‐test data sets, is advocated. © 2002 Wiley Periodicals, Inc. Col Res Appl, 27, 122–125, 2002; DOI 10.1002/col.10027
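The cross‐validation procedure the abstract advocates can be sketched as follows: split the data into a parameter‐training set and an independent test set, then watch the gap between training error and test error as model degrees of freedom grow. The CRT measurement data and neural network from the paper are not available here, so a synthetic curve‐fitting task with polynomial "models" of increasing degree stands in for them; all names and data in this sketch are illustrative, not the authors' method.

```python
"""Minimal cross-validation sketch: training error keeps falling as
degrees of freedom increase, while error on the held-out test set
reveals the overtrained (overfit) condition."""
import numpy as np


def train_test_errors(degrees, seed=0):
    rng = np.random.default_rng(seed)
    # Synthetic stand-in data: a smooth curve plus measurement noise.
    x = np.linspace(0.0, 1.0, 40)
    y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.1, size=x.shape)

    # Cross-validation split: every other sample held out for testing.
    x_tr, y_tr = x[::2], y[::2]
    x_te, y_te = x[1::2], y[1::2]

    def rmse(pred, target):
        return float(np.sqrt(np.mean((pred - target) ** 2)))

    results = {}
    for deg in degrees:
        # deg + 1 coefficients = the model's degrees of freedom.
        model = np.polynomial.Polynomial.fit(x_tr, y_tr, deg)
        results[deg] = (rmse(model(x_tr), y_tr), rmse(model(x_te), y_te))
    return results


if __name__ == "__main__":
    for deg, (tr, te) in train_test_errors([1, 3, 15]).items():
        print(f"degree {deg:2d}: train RMSE {tr:.3f}, test RMSE {te:.3f}")
```

With a least-squares fit, training error can only shrink as the degree rises, so the independent test set is the only signal that the high-degree model has started fitting noise; this is the same diagnostic the paper applies to network training.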