Date of Award

2019

Publication Type

Master's Thesis

Degree Name

M.Sc.

Department

Computer Science

First Advisor

Ziad Kobti

Keywords

Cultural Algorithms, Deep Neural Networks, Evolutionary Pruning

Rights

info:eu-repo/semantics/openAccess

Creative Commons License

Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License
This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License.

Abstract

The success of Deep Neural Networks (DNNs) in classification has been accompanied by a drastic increase in the number of weight parameters, which in turn increases computational and storage costs. Pruning a DNN involves identifying and removing redundant parameters with little or no loss of accuracy. Layer-wise pruning of weights by magnitude has been shown to be an efficient way to prune neural networks. However, finding the optimal threshold value for each layer is a challenging task given the large search space. To solve this problem, we use a multi-population cultural algorithm, an evolutionary algorithm that exploits domain knowledge for faster convergence and has been applied to many optimization problems. We evaluate it on LeNet-style models and measure the level of pruning through the pruning ratio. Results show that our method achieves the best pruning ratio (864 on LeNet-5) compared with several state-of-the-art DNN pruning methods. By removing redundant parameters, computational and storage costs are reduced significantly.
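The two ideas in the abstract can be illustrated together: layer-wise magnitude pruning given per-layer thresholds, and an evolutionary search over those thresholds guided by a belief space. The sketch below is a simplified, single-population illustration under our own assumptions, not the thesis implementation; every name (`prune`, `pruning_ratio`, `fitness`, `cultural_search`) and the surrogate fitness function are hypothetical.

```python
import numpy as np

# Hedged sketch of layer-wise magnitude pruning plus a toy cultural-algorithm
# threshold search. Simplified to one population; the thesis uses multiple.

def prune(weights, thresholds):
    # Zero out weights whose magnitude falls below the layer's threshold.
    return [np.where(np.abs(w) < t, 0.0, w) for w, t in zip(weights, thresholds)]

def pruning_ratio(weights, pruned):
    # Total parameters divided by surviving (non-zero) parameters.
    total = sum(w.size for w in weights)
    kept = sum(np.count_nonzero(w) for w in pruned)
    return total / max(kept, 1)

def fitness(weights, thresholds):
    # Toy surrogate: reward sparsity, penalize the removed weight mass
    # (a stand-in for the accuracy loss a real method would measure).
    pruned = prune(weights, thresholds)
    removed = sum(np.abs(w - p).sum() for w, p in zip(weights, pruned))
    return pruning_ratio(weights, pruned) - 0.1 * removed

def cultural_search(weights, pop_size=20, generations=30, seed=0):
    rng = np.random.default_rng(seed)
    n_layers = len(weights)
    pop = rng.uniform(0.0, 1.0, size=(pop_size, n_layers))
    for _ in range(generations):
        scores = np.array([fitness(weights, t) for t in pop])
        elite = pop[np.argsort(scores)[-max(pop_size // 4, 1):]]
        # Belief-space update: tighten normative threshold ranges to the elites.
        low, high = elite.min(axis=0), elite.max(axis=0)
        span = np.maximum(high - low, 1e-3)
        # Influence step: resample within the accepted ranges, plus noise.
        pop = rng.uniform(low, low + span, size=(pop_size, n_layers))
        pop = np.clip(pop + rng.normal(0.0, 0.05, size=pop.shape), 0.0, None)
    scores = np.array([fitness(weights, t) for t in pop])
    return pop[np.argmax(scores)]
```

A usage example: `best = cultural_search([w1, w2])` returns one threshold per layer, and `pruning_ratio(layers, prune(layers, best))` measures the resulting compression, mirroring how the abstract reports pruning ratios.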
