First a few words about my motivation to write this class structure.
Many applications have been written for the existing neural network types. Each of those programs focuses on one special problem to be solved.
So if you want to implement a neural net similar to one already written, you can either try to understand its source code (which means "fun" for many hours) or copy the source code and modify it until it fits your own net's purpose (which is not the best way of developing software).
Since most of the existing neural net types share some common components, I thought it would be nice to implement those components once and reuse them later in different applications, like some sort of construction kit.
I decided to write classes suitable for the Backpropagation Net and the Kohonen Feature Map, because (in my honest opinion) these are the most powerful types of neural networks.
And here is the result of my work:
(You should always keep in mind that this was my first object oriented piece of software, so don't blame me!)
This is not to be read as a class hierarchy, as explained in the previous section; the levels do not represent levels of abstraction.
The figure only shows which classes exist and which classes have relationships with each other.
Furthermore, it demonstrates my primary goal: the integration of common components.
As you can see, for example, the WeightMatrix class and the NeuronLayer class are used by both net classes (BackpropagationNet and KohonenFeatureMap), although their internal structure differs between the two net types. Object Orientation rules!
The following sections explain the classes in greater detail.