Date of Award

1994

Degree Type

Thesis

Degree Name

M.Sc.A.

Department

Electrical and Computer Engineering

First Advisor

Loh, Robert Nan K.

Keywords

Engineering, Electronics and Electrical.

Rights

CC BY-NC-ND 4.0

Abstract

This thesis considers the training and synthesis of multilayer, multi-output feedforward artificial neural networks with hard-limiting neurons. Hard-limiting neurons are of interest because they are easier to implement in VLSI technology. New results are presented on varying the neuron activation function, accelerating the convergence of training, and synthesizing neural networks. A steepness factor that allows the neuron activation function to be varied between sigmoidal and hard-limiting behaviour is used. A schedule has been developed whereby this factor is changed during training according to an exponential function with a negative exponent given by the sum-squared error. The resulting varying activation function is used in conjunction with the standard backpropagation algorithm to allow the training of networks with hard-limiting neurons rather than only sigmoidal ones. A method to accelerate the convergence of the backpropagation algorithm has been derived using the concept of orthogonal vectors during successive iterations. A new expression for updating the weighting coefficients has been derived that contains a momentum function instead of simply a momentum constant. Finally, a new synthesis technique is presented for multilayer, multi-output neural networks that uses a generalized Tiling algorithm developed in the thesis.

Dept. of Electrical and Computer Engineering. Paper copy at Leddy Library: Theses & Major Papers - Basement, West Bldg. / Call Number: Thesis1994 .Y815. Source: Masters Abstracts International, Volume: 33-04, page: 1314. Supervisors: Robert Nan K. Loh; W. C. Miller. Thesis (M.Sc.A.)--University of Windsor (Canada), 1994.
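The abstract describes an activation whose steepness is scheduled by the training error: large error keeps the neuron sigmoidal (smooth gradients for backpropagation), and as the sum-squared error falls the neuron approaches a hard limiter. The sketch below is a minimal illustration of that idea only; the function names, the gain `beta_max`, and the rate `k` are assumptions for illustration, not the thesis's actual formulation.

```python
import math

def steepness(sse, beta_max=25.0, k=1.0):
    # Hypothetical schedule (beta_max and k are illustrative choices):
    # the steepness factor grows toward beta_max as the sum-squared
    # error shrinks, via an exponential whose negative exponent is
    # given by the SSE, as the abstract describes.
    return beta_max * math.exp(-k * sse)

def activation(x, beta):
    # Sigmoid with steepness factor beta; as beta grows, the curve
    # approaches a hard-limiting (step) neuron.
    return 1.0 / (1.0 + math.exp(-beta * x))

# Early in training (large SSE): nearly sigmoidal, smooth gradients.
# Late in training (SSE near 0): nearly a hard limiter.
for sse in (4.0, 1.0, 0.01):
    b = steepness(sse)
    print(f"SSE={sse:5.2f}  beta={b:6.2f}  f(0.2)={activation(0.2, b):.3f}")
```

With this schedule the network can be trained by ordinary gradient descent while the error is large, yet converge to neurons that behave like the hard limiters targeted for VLSI implementation.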
