Title: Modelling a neural network using an algebraic method
Authors: Prompong Sugunnasil
Samerkae Somhom
Watcharee Jumpamule
Natee Tongsiri
Keywords: Multidisciplinary
Issue Date: 1-Jan-2014
Abstract: This paper presents a framework based on algebraic structures for formalizing various types of neural networks. The working strategy is to break neural networks down into building blocks, the relationships between those blocks, and their operations. Building blocks are collections of primary components, or neurons. Neurons, in turn, are collections of properties that function as single entities, transforming an input into an output. We perceive a neuron as a function; the flow of information in a neural network is therefore a composition of functions. We also define an abstract data structure called a layer, a collection of entities that exist in the same time step. The layer concept permits parallel computation in our model. There are two types of operation in the model: recalling operators, which challenge the neural network with data, and training operators, which change the parameters of neurons to fit the data. From this point of view, all neural networks can be constructed or modelled using the same structures with different parameters.
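The abstract's view — a neuron as a function, a layer as a collection of functions applied in the same time step, and recall as function composition — can be sketched as follows. This is an illustrative reconstruction, not the paper's actual formalism; all names (`make_neuron`, `make_layer`, `recall`, `make_train_step`) are hypothetical, and the threshold activation and perceptron-style update are stand-ins for whichever operators a concrete network would supply.

```python
def make_neuron(weights, bias):
    """A neuron as a function: transforms an input vector into an output."""
    def neuron(x):
        s = sum(w * xi for w, xi in zip(weights, x)) + bias
        return 1.0 if s >= 0 else 0.0  # simple threshold activation (assumed)
    return neuron

def make_layer(neurons):
    """A layer: entities in the same time step, so each neuron can be
    evaluated independently (hence in parallel) on the same input."""
    def layer(x):
        return [n(x) for n in neurons]
    return layer

def recall(layers, x):
    """Recalling operator: challenge the network with data by composing
    the layer functions in sequence."""
    for layer in layers:
        x = layer(x)
    return x

def make_train_step(lr=0.1):
    """Training operator: change a neuron's parameters to fit the data
    (here a perceptron-style update, one of many possible choices)."""
    def train_step(weights, bias, x, target):
        err = target - make_neuron(weights, bias)(x)
        new_w = [w + lr * err * xi for w, xi in zip(weights, x)]
        return new_w, bias + lr * err
    return train_step

# A two-layer threshold network computing XOR (a standard construction).
hidden = make_layer([make_neuron([1, 1], -0.5),    # OR-like unit
                     make_neuron([1, 1], -1.5)])   # AND-like unit
output = make_layer([make_neuron([1, -2], -0.5)])  # OR and not-AND
net = [hidden, output]

print(recall(net, [1, 0]))  # → [1.0]
```

Under this reading, "same structures with different parameters" means every network is the same `recall` composition; only the weight and bias parameters handed to each neuron differ.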
ISSN: 1513-1874
Appears in Collections:CMUL: Journal Articles

Files in This Item:
There are no files associated with this item.

Items in CMUIR are protected by copyright, with all rights reserved, unless otherwise indicated.