What is it about?
In recent years, convolutional neural networks have made remarkable progress in computer vision. These networks contain large numbers of parameters, which must be reduced for deployment in resource-constrained environments. Various lightweight methods have been proposed to address this, among them network pruning, which can be formulated as a combinatorial optimization problem. Heuristic algorithms are known to be more efficient than exhaustive search for such problems. This paper therefore applies a surrogate-model-assisted genetic algorithm (SMA-GA) to network pruning, using DenseNet-BC (k=12) as the baseline model.
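To see why exhaustive search is impractical here, the short sketch below counts candidate pruning configurations under a hypothetical per-layer encoding; the layer counts and number of discrete pruning levels are illustrative assumptions, not values taken from the paper.

```python
# Rough, back-of-the-envelope illustration (numbers are hypothetical, not from the paper):
# with a per-layer choice among a handful of candidate pruning rates, the number of
# distinct pruning configurations grows exponentially with network depth.
layers_per_block = 16     # hypothetical count of prunable layers per dense block
num_blocks = 3            # DenseNet-BC for CIFAR stacks three dense blocks
candidate_rates = 10      # hypothetical number of discrete pruning levels per layer

search_space = candidate_rates ** (layers_per_block * num_blocks)
print(f"Candidate configurations: {search_space:.3e}")  # ~1e48: far too many to enumerate
```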
Why is it important?
A multi-dimensional encoding scheme is used for the genetic algorithm, and fitness values are approximated with a surrogate model so that a more extensive search space can be explored. On CIFAR-10 and CIFAR-100, the error rate and parameter count of the models pruned by SMA-GA were compared with the baseline model. On CIFAR-10, the number of parameters was reduced by up to 36.25% and the error rate by 0.39%; on CIFAR-100, the number of parameters was reduced by up to 22.5% and the error rate by 0.91%. These results demonstrate that the proposed method effectively reduces the number of parameters with a negligible impact on the error rate.
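To make the idea concrete, the following minimal sketch shows the general shape of a surrogate-model-assisted GA loop: a cheap predictor ranks offspring, and the expensive evaluation of a pruned network is reserved for the most promising candidates. The chromosome layout, surrogate model, hyperparameters, and the placeholder `evaluate_true_fitness` are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# --- Illustrative assumptions (not the paper's exact setup) ---------------
NUM_GENES = 3           # e.g. one pruning ratio per dense block (multi-dimensional encoding)
POP_SIZE = 20
GENERATIONS = 10
TRUE_EVALS_PER_GEN = 4  # only a few candidates receive an expensive "real" evaluation

def evaluate_true_fitness(chromosome):
    """Hypothetical placeholder for the expensive step: prune DenseNet-BC according
    to `chromosome`, fine-tune briefly, and return a score trading off error rate
    against parameter count. A toy stand-in is used so the sketch runs end to end."""
    return -np.sum((chromosome - 0.4) ** 2) + rng.normal(scale=0.01)

def crossover_and_mutate(parents):
    """Uniform crossover followed by small Gaussian mutation, clipped to [0, 1]."""
    children = []
    for _ in range(POP_SIZE):
        a = parents[rng.integers(len(parents))]
        b = parents[rng.integers(len(parents))]
        mask = rng.random(NUM_GENES) < 0.5
        child = np.where(mask, a, b) + rng.normal(scale=0.05, size=NUM_GENES)
        children.append(np.clip(child, 0.0, 1.0))
    return np.array(children)

# Initial population: each chromosome is a vector of pruning ratios.
population = rng.random((POP_SIZE, NUM_GENES))
archive_x, archive_y = [], []

# Seed the surrogate with true evaluations of the initial population.
for x in population:
    archive_x.append(x)
    archive_y.append(evaluate_true_fitness(x))

surrogate = RandomForestRegressor(n_estimators=100, random_state=0)

for gen in range(GENERATIONS):
    surrogate.fit(np.array(archive_x), np.array(archive_y))

    # Generate offspring and rank them cheaply with the surrogate.
    offspring = crossover_and_mutate(population)
    predicted = surrogate.predict(offspring)
    order = np.argsort(predicted)[::-1]

    # Spend the expensive, true evaluations only on the most promising offspring.
    for idx in order[:TRUE_EVALS_PER_GEN]:
        archive_x.append(offspring[idx])
        archive_y.append(evaluate_true_fitness(offspring[idx]))

    # Survivor selection on predicted fitness keeps the loop cheap.
    population = offspring[order[:POP_SIZE]]

best = archive_x[int(np.argmax(archive_y))]
print("Best pruning-ratio vector found:", np.round(best, 3))
```

The key design choice the sketch illustrates is that the surrogate model, trained on an archive of truly evaluated candidates, lets the genetic algorithm screen many more pruning configurations than could ever be trained and tested directly.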
Read the Original
This page is a summary of: Efficient Pruning of DenseNet via a Surrogate-Model-Assisted Genetic Algorithm, July 2024, ACM (Association for Computing Machinery),
DOI: 10.1145/3638530.3654409.