
On the Properties of Bias-Variance Decomposition for kNN Regression

Journal Известия Иркутского государственного университета. Серия: Математика (Bulletin of Irkutsk State University. Series Mathematics)
ISSN: 1997-7670
Output data: Year: 2023, Volume: 43, Pages: 110-121 (12 pages), DOI: 10.26516/1997-7670.2023.43.110
Tags bias-variance decomposition, machine learning, k-nearest neighbors algorithm, overfitting
Authors: Nedel'ko Viktor M. 1
Affiliations
1 Sobolev Institute of Mathematics SB RAS, Novosibirsk, Russian Federation

Funding (1)

1 Sobolev Institute of Mathematics FWNF-2022-0015

Abstract: When choosing the optimal complexity of a method for constructing decision functions, an important tool is the decomposition of the quality criterion into bias and variance components. It is generally assumed (and in practice this is most often true) that as the complexity of the method increases, the bias component monotonically decreases while the variance component increases. The present study shows that in some cases this behavior is violated. In this paper, we obtain an expression for the variance component of the kNN method for the linear regression problem in the setting where the “explanatory” features are random variables. In contrast to the well-known result obtained for non-random “explanatory” variables, in the considered case the variance may increase as k grows.
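The effect described in the abstract can be probed empirically. The sketch below is an illustration only, not the paper's derivation: the linear target f(x) = 2x, the uniform feature distribution, the sample size, and the noise level are all assumed for the demonstration. It estimates the squared bias and the variance of a one-dimensional kNN regressor at a fixed test point by repeatedly resampling training sets with random explanatory features, mirroring the random-design setting considered in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def knn_predict(x_train, y_train, x0, k):
    # Average the targets of the k training points nearest to x0.
    idx = np.argsort(np.abs(x_train - x0))[:k]
    return y_train[idx].mean()

def bias_variance_at_point(k, x0=0.5, n=50, trials=2000, noise=0.3):
    """Monte Carlo estimate of bias^2 and variance of a kNN prediction at x0.

    Assumed setup (for illustration): true regression function f(x) = 2x,
    features drawn uniformly on [0, 1], Gaussian noise on the targets.
    """
    f = lambda x: 2.0 * x
    preds = np.empty(trials)
    for t in range(trials):
        x = rng.uniform(0.0, 1.0, n)          # random "explanatory" feature
        y = f(x) + rng.normal(0.0, noise, n)  # noisy linear response
        preds[t] = knn_predict(x, y, x0, k)
    bias2 = (preds.mean() - f(x0)) ** 2
    var = preds.var()
    return bias2, var
```

For a fixed (non-random) design, the variance of the kNN prediction is the classical sigma^2 / k and can only decrease in k; in the random-design simulation above, the locations of the k nearest neighbors vary between training samples, so the prediction inherits an extra variance term that depends on k, which is the mechanism behind the non-monotone behavior studied in the paper.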
Cite: Nedel'ko V.M.
On the Properties of Bias-Variance Decomposition for kNN Regression
Известия Иркутского государственного университета. Серия: Математика (Bulletin of Irkutsk State University. Series Mathematics). 2023. V. 43. P. 110-121. DOI: 10.26516/1997-7670.2023.43.110
Dates:
Submitted: Dec 5, 2022
Accepted: Jan 23, 2023
Published print: Apr 5, 2023
Published online: Apr 5, 2023
Identifiers:
Web of science: WOS:000954547200008
Elibrary: 50361679
OpenAlex: W4327648033
Citing:
Elibrary: 1
OpenAlex: 2