Distillation of Knowledge in Boosting Models (scientific publication)

Journal: Pattern Recognition and Image Analysis
ISSN: 1054-6618, E-ISSN: 1555-6212
Publication details: Year: 2025, Volume: 35, Issue: 3, Pages: 313-318 (6 pages), DOI: 10.1134/S1054661825700221
Keywords: knowledge distillation, machine learning boosting, overfitting
Authors: Nedel'ko V.M. (1)
Affiliations:
1. Sobolev Inst Math, Novosibirsk 630090, Russia

Funding information (1)

1. Sobolev Institute of Mathematics SB RAS, FWNF-2022-0015

Abstract: The paper explores the possibility of applying the idea of knowledge distillation to the boosting method. The rationale for this approach is that, in many cases, the best prediction quality is achieved by ensembles that use trees of excessive depth. In such cases, it may be worthwhile to train an ensemble of shallower trees using a deeper model as a "teacher." This makes it possible, in particular, to assess the real "depth" of the dependences between variables in a problem, as well as to obtain more interpretable visualizations of solutions. The study also provides material for understanding the mechanisms behind the effectiveness of the knowledge distillation procedure.
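
A minimal sketch of the distillation setup described above, assuming a standard gradient-boosting implementation (scikit-learn's GradientBoostingRegressor) rather than the authors' own code: an ensemble of deep trees acts as the "teacher," and an ensemble of shallow trees (the "student") is fitted to the teacher's predictions instead of the original labels. The dataset, model choices, and hyperparameters are illustrative assumptions.

    # Illustrative only: distilling a deep-tree boosting ensemble into a shallow-tree one.
    from sklearn.datasets import make_friedman1
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.metrics import mean_squared_error
    from sklearn.model_selection import train_test_split

    X, y = make_friedman1(n_samples=2000, noise=1.0, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Teacher: boosting with trees of (possibly excessive) depth.
    teacher = GradientBoostingRegressor(max_depth=6, n_estimators=300, random_state=0)
    teacher.fit(X_train, y_train)

    # Student: boosting with shallow trees, trained on the teacher's outputs
    # rather than the original labels (the distillation step).
    student = GradientBoostingRegressor(max_depth=2, n_estimators=300, random_state=0)
    student.fit(X_train, teacher.predict(X_train))

    # Baseline: the same shallow model trained directly on the labels, for comparison.
    baseline = GradientBoostingRegressor(max_depth=2, n_estimators=300, random_state=0)
    baseline.fit(X_train, y_train)

    for name, model in [("teacher", teacher), ("student", student), ("baseline", baseline)]:
        print(name, mean_squared_error(y_test, model.predict(X_test)))

Comparing the student's test error with the shallow baseline's indicates whether the teacher's smoothed targets let shallow trees approach the deep model's quality, which mirrors the paper's question about the real "depth" of the dependences in the data.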
Bibliographic reference: Nedel'ko V.M.
Distillation of Knowledge in Boosting Models
Pattern Recognition and Image Analysis. 2025. V. 35. N 3. P. 313-318. DOI: 10.1134/S1054661825700221 (indexed in WOS, Scopus, RSCI)
Dates:
Received: March 25, 2025
Accepted for publication: April 9, 2025
Published in print: October 23, 2025
Published online: October 23, 2025
Database identifiers:
Web of Science: WOS:001597069500018
Scopus: 2-s2.0-105019389334
RSCI: 83051313
Citations in databases: no citations yet