Differentiable pruning by convolutional projection

Journal Сибирские электронные математические известия (Siberian Electronic Mathematical Reports), E-ISSN: 1813-3304
Output data Year: 2025, Volume: 22, Number: 2, Pages: C18-C29, Pages count: 12, DOI: 10.33048/semi.2025.22.C02
Tags Deep Learning, Computer Vision, Pruning.
Authors Chudakov D.S. 1 , Berikov V.B. 2
Affiliations
1 Novosibirsk State University
2 Sobolev Institute of Mathematics

Funding (1)

1 Sobolev Institute of Mathematics FWNF-2022-0015

Abstract: Structured pruning is a widely used technique for reducing the computational complexity of neural networks. The standard pruning process typically consists of two stages: first, identifying and removing the least significant blocks of parameters, and second, fine-tuning the network to restore its original accuracy. However, this separation of stages often results in suboptimal outcomes. In this paper, we propose a novel convolutional neural network pruning procedure that is fully end-to-end and differentiable. Our approach involves projecting convolutional kernels into a reduced number of channels through learnable linear transformations, implemented using 1x1 convolutional layers. Specifically, for each convolutional layer to be pruned, we insert learnable convolutions both before and after the layer. The pre-layer expands the number of channels to the original size, while the post-layer reduces the number of channels. Only these auxiliary convolutions are trained, after which the learned transformations are fused with the original layer to produce a pruned convolutional kernel. The proposed method demonstrates superior accuracy compared to traditional pruning techniques, particularly when removing a substantial number of parameters.
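The projection-and-fuse scheme described in the abstract can be made concrete with a short sketch. The PyTorch code below is not from the paper: the class name ProjectedConv, the bias handling, and the assumption of an ordinary (groups=1) convolution are illustrative guesses based only on the abstract. It wraps a frozen KxK convolution with trainable 1x1 projections (the pre-layer expands the reduced input channels back to the original count, the post-layer reduces the outputs), then fuses the three linear maps into one smaller kernel via F[o',i'] = Σ_{o,i} Q[o',o] · W[o,i] · P[i,i'] applied at each spatial tap.

```python
import torch
import torch.nn as nn

class ProjectedConv(nn.Module):
    """Sketch of the abstract's idea: a frozen KxK conv wrapped by
    trainable 1x1 projections that shrink its channel counts."""

    def __init__(self, conv: nn.Conv2d, in_kept: int, out_kept: int):
        super().__init__()
        self.conv = conv
        for p in self.conv.parameters():
            p.requires_grad = False  # per the abstract, only the auxiliary 1x1 convs are trained
        # pre-layer: expand the reduced input channels back to the original count
        self.pre = nn.Conv2d(in_kept, conv.in_channels, kernel_size=1, bias=False)
        # post-layer: reduce the original output channels to the kept count
        self.post = nn.Conv2d(conv.out_channels, out_kept, kernel_size=1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.post(self.conv(self.pre(x)))

    def fuse(self) -> nn.Conv2d:
        """Collapse post ∘ conv ∘ pre into a single smaller convolution.
        Valid because all three maps are linear and applied back-to-back."""
        W = self.conv.weight                           # (O, I, k, k)
        P = self.pre.weight.squeeze(-1).squeeze(-1)    # (I, I')
        Q = self.post.weight.squeeze(-1).squeeze(-1)   # (O', O)
        # F[o', i', :, :] = sum over o, i of Q[o', o] * W[o, i, :, :] * P[i, i']
        fused_w = torch.einsum('po,oikl,iq->pqkl', Q, W, P)
        fused = nn.Conv2d(P.shape[1], Q.shape[0],
                          kernel_size=self.conv.kernel_size,
                          stride=self.conv.stride,
                          padding=self.conv.padding,
                          bias=self.conv.bias is not None)
        fused.weight.data.copy_(fused_w)
        if self.conv.bias is not None:
            fused.bias.data.copy_(Q @ self.conv.bias)  # bias passes through the post-projection
        return fused
```

After fine-tuning, each ProjectedConv would be replaced by the result of fuse(), yielding a genuinely smaller network. The sketch treats one layer in isolation; how channel counts of adjacent pruned layers are kept consistent is not specified in the abstract.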
Cite: Chudakov D.S. , Berikov V.B.
Differentiable pruning by convolutional projection
Сибирские электронные математические известия (Siberian Electronic Mathematical Reports). 2025. V.22. N2. P.C18-C29. DOI: 10.33048/semi.2025.22.C02
Dates:
Accepted: Jun 17, 2025
Published print: Nov 6, 2025
Published online: Nov 6, 2025
Identifiers: none
Citing: No citations yet