Date of Award


Publication Type

Master Thesis

Degree Name



Computer Science

First Advisor

Ziad Kobti


Keywords

Cultural Algorithms, Deep Neural Networks, Evolutionary Pruning




Abstract

The success of Deep Neural Networks (DNNs) in classification is accompanied by a drastic increase in the number of weight parameters, which in turn increases computational and storage costs. Pruning a DNN involves identifying and removing redundant parameters with little or no loss of accuracy. Layer-wise pruning of weights by magnitude has been shown to be an efficient way to prune neural networks; however, finding the optimal threshold value for each layer is challenging given the large search space. To solve this problem, we use a multi-population cultural algorithm, an evolutionary algorithm that exploits knowledge domains for faster convergence and has been applied to many optimization problems. We evaluate it on LeNet-style models and measure the level of pruning through the pruning ratio. Results show that our method achieves the best pruning ratio (864 on LeNet-5) compared with several state-of-the-art DNN pruning methods. By removing redundant parameters, computational and storage costs are reduced significantly.
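To illustrate the two ideas in the abstract, the sketch below applies layer-wise magnitude pruning with per-layer thresholds and searches for good thresholds with a simple evolutionary loop. This is not the thesis's multi-population cultural algorithm: a basic (mu + lambda) search stands in for it, the layers are random toy weights rather than a trained LeNet, and the fitness penalty is a hypothetical proxy for accuracy loss.

```python
import numpy as np

rng = np.random.default_rng(0)

def prune_layerwise(layers, thresholds):
    """Zero out weights whose magnitude is below each layer's threshold."""
    return [W * (np.abs(W) >= t) for W, t in zip(layers, thresholds)]

def pruning_ratio(layers, pruned):
    """Total parameters divided by parameters remaining after pruning."""
    total = sum(W.size for W in layers)
    remaining = sum(int(np.count_nonzero(P)) for P in pruned)
    return total / max(remaining, 1)

# Toy stand-in for trained weights (real use: a trained network's layers).
layers = [rng.standard_normal((8, 8)), rng.standard_normal((4, 4))]

def fitness(thresholds):
    """Proxy objective: reward a high pruning ratio, penalize the fraction
    of total weight magnitude removed (a stand-in for accuracy loss)."""
    pruned = prune_layerwise(layers, thresholds)
    kept = sum(np.abs(P).sum() for P in pruned)
    total = sum(np.abs(W).sum() for W in layers)
    removed_frac = 1.0 - kept / total
    return pruning_ratio(layers, pruned) - 50.0 * removed_frac

# Simple (mu + lambda) evolutionary search over per-layer thresholds.
pop = rng.uniform(0.0, 1.0, size=(20, len(layers)))
for _ in range(30):
    children = np.clip(pop + rng.normal(0, 0.1, pop.shape), 0.0, 3.0)
    both = np.vstack([pop, children])
    scores = np.array([fitness(t) for t in both])
    pop = both[np.argsort(scores)[-20:]]   # keep the 20 fittest candidates

best = pop[-1]                             # highest-fitness threshold vector
print("best thresholds:", best)
print("pruning ratio:", pruning_ratio(layers, prune_layerwise(layers, best)))
```

The full method replaces this toy search with a multi-population cultural algorithm, whose belief space shares knowledge across sub-populations to converge faster, and evaluates fitness against the actual network accuracy rather than a magnitude proxy.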