Synaptic Neural Networks--Supervised Learning Without Weights

Author

Chris Caswell

Date of Award

2009

Document Type

Thesis

Degree Name

Bachelor's

Department

Natural Sciences

First Advisor

Henckell, Karsten

Keywords

Computer Science, Artificial Neural Networks, Artificial Intelligence

Area of Concentration

Computer Science

Abstract

This thesis proposes a novel model of artificial neural networks in which the notion of synaptic weights is removed and a Gaussian activation function is used. The new, sporadically-connected neural networks are trained by a probabilistic extension of the well-known error-backpropagation algorithm and tested using a set of standard benchmarking rules and problem sets. Despite its simplicity, the proposed model is shown to be capable of generalizing on real-world data with performance comparable to that of a Gaussian-activated weighted network. We then explore the possible advantages the model might have for efficient FPGA hardware implementations and its biological relevance to the current understanding and modeling of neuroplasticity.

Rights

This bibliographic record is available under the Creative Commons CC0 public domain dedication. The New College of Florida, as creator of this bibliographic record, has waived all rights to it worldwide under copyright law, including all related and neighboring rights, to the extent allowed by law.

