INTERAMINENSE, C. D. O.; http://lattes.cnpq.br/2551570530403518; INTERAMINENSE, Carlos Daniel Oliveira.
Abstract:
The use of Deep Neural Networks (DNNs) to solve machine learning problems became widespread after 2012, when the AlexNet model won the ImageNet challenge, which required classifying images into one of a thousand possible categories. Since then, increasingly complex DNNs have emerged, some with billions of parameters. As a result, pruning techniques have become a way to reduce the complexity of a DNN: they remove parameters from the input model, producing a less complex model with accuracy comparable to that of the input model. In this context, the present research proposes a structured pruning technique that considers the causal effect between neurons to decide which ones will be pruned, along with all their connections. The results obtained in this research show that the proposed technique yields models with higher accuracy than the other pruning techniques investigated in this dissertation, while achieving better inference time and disk space usage than the input model.
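To illustrate the kind of structured pruning the abstract describes, the sketch below removes whole neurons (and all their incoming and outgoing connections) from a single fully connected layer. This is a minimal NumPy illustration, not the dissertation's method: the `scores` here use a simple L1-magnitude placeholder, whereas the actual technique ranks neurons by their causal effect, which is not specified in the abstract.

```python
import numpy as np

def prune_neurons(W_in, W_out, scores, keep_ratio=0.5):
    """Structured pruning sketch: drop the neurons (rows of W_in /
    columns of W_out) with the lowest importance scores, removing
    all of their connections at once."""
    n = W_in.shape[0]
    k = max(1, int(n * keep_ratio))
    keep = np.sort(np.argsort(scores)[-k:])  # indices of neurons kept
    return W_in[keep, :], W_out[:, keep]

# Toy layer: 4 hidden neurons, 3 inputs, 2 outputs
rng = np.random.default_rng(0)
W_in = rng.normal(size=(4, 3))    # incoming weights (neurons x inputs)
W_out = rng.normal(size=(2, 4))   # outgoing weights (outputs x neurons)

# Placeholder importance score; the dissertation's causal-effect
# criterion would be substituted here.
scores = np.abs(W_in).sum(axis=1)

W_in_p, W_out_p = prune_neurons(W_in, W_out, scores, keep_ratio=0.5)
print(W_in_p.shape, W_out_p.shape)  # → (2, 3) (2, 2)
```

Because entire neurons are removed, the pruned matrices are genuinely smaller and dense, which is what makes structured pruning translate directly into reduced inference time and disk space, as the abstract reports.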