SOM classifier

The SOM (Self-Organizing Map) classifier generates a two-dimensional artificial neural network and trains the network with the given associations. As a consequence, topics with similar associations end up located near each other in the neural network. After training, the Wandora user can view, analyze, and create new associations using the neural network. For example, a grouping association can be created to represent topic similarity. Consider Wandora's SOM classifier an experimental feature.

To start with the SOM classifier, select a large number of binary associations and choose SOM classifier... in the context menu of your selection. The classifier first asks for the grouping role. Players of the grouping role specify the labels for the training vectors. Let's say the user has selected eight binary associations:

a-q
a-e
a-t
b-e
b-l
c-q
c-o
d-e

The user selects the first player as the grouping role. The training vector labels are a, b, c, and d. The training vectors for this selection are:

a: [ q=1, e=1, t=1, l=0, o=0 ]
b: [ q=0, e=1, t=0, l=1, o=0 ]
c: [ q=1, e=0, t=0, l=0, o=1 ]
d: [ q=0, e=1, t=0, l=0, o=0 ]

in other words

a: [ 1, 1, 1, 0, 0 ]
b: [ 0, 1, 0, 1, 0 ]
c: [ 1, 0, 0, 0, 1 ]
d: [ 0, 1, 0, 0, 0 ]
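
Building these one-hot vectors from the selected associations is straightforward. The following Java sketch is not Wandora's actual code; the class name and data layout are made up for illustration, with the eight example associations hard-coded as pairs:

import java.util.*;

// Hypothetical sketch: build one-hot training vectors from binary associations.
// The pairs mirror the example selection above; all names are illustrative only.
public class TrainingVectors {
    public static void main(String[] args) {
        String[][] associations = {
            {"a","q"}, {"a","e"}, {"a","t"}, {"b","e"},
            {"b","l"}, {"c","q"}, {"c","o"}, {"d","e"}
        };

        // Collect grouping labels (first players) and feature players (second players),
        // preserving first-seen order.
        LinkedHashSet<String> labels = new LinkedHashSet<>();
        LinkedHashSet<String> features = new LinkedHashSet<>();
        for (String[] pair : associations) { labels.add(pair[0]); features.add(pair[1]); }
        List<String> featureList = new ArrayList<>(features);

        // One training vector per grouping label; a component is 1 if the
        // label-feature association was selected, 0 otherwise.
        for (String label : labels) {
            double[] vector = new double[featureList.size()];
            for (String[] pair : associations) {
                if (pair[0].equals(label)) vector[featureList.indexOf(pair[1])] = 1.0;
            }
            System.out.println(label + ": " + Arrays.toString(vector));
        }
    }
}

Running the sketch prints the four vectors listed above, one per grouping label.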

When the training vectors are ready, Wandora creates a neuron matrix. The matrix is a two-dimensional grid of neuron cells. Each neuron holds a random vector with the same dimension as the training vectors; in our example this dimension is 5. Wandora then trains the matrix with the training vectors. When training ends, the neuron matrix contains adapted neurons. Due to the training method, the neurons resemble the training vectors, and nearby neurons are generally more similar than distant ones.
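
The training step can be approximated with the classic SOM update rule. The sketch below is a minimal, self-contained Java version, not Wandora's implementation; the matrix size (6x6), number of epochs, learning rate, and neighborhood radius are assumptions chosen for illustration:

import java.util.Random;

// Minimal SOM training sketch with the four example training vectors.
public class SomTraining {
    public static void main(String[] args) {
        double[][] training = {
            {1,1,1,0,0}, {0,1,0,1,0}, {1,0,0,0,1}, {0,1,0,0,0}
        };
        int rows = 6, cols = 6, dim = training[0].length;
        Random rnd = new Random();

        // Neuron matrix: each cell holds a random vector of the same dimension (5).
        double[][][] neurons = new double[rows][cols][dim];
        for (int r = 0; r < rows; r++)
            for (int c = 0; c < cols; c++)
                for (int d = 0; d < dim; d++)
                    neurons[r][c][d] = rnd.nextDouble();

        int epochs = 200;
        for (int t = 0; t < epochs; t++) {
            double learningRate = 0.5 * (1.0 - (double) t / epochs);  // decays over time
            double radius = 3.0 * (1.0 - (double) t / epochs) + 0.5;  // shrinking neighborhood
            for (double[] x : training) {
                int[] bmu = bestMatchingUnit(neurons, x);
                // Pull the best matching cell and its neighbors towards the training
                // vector; closer cells move more, which makes nearby neurons similar.
                for (int r = 0; r < rows; r++) {
                    for (int c = 0; c < cols; c++) {
                        double gridDist2 = (r - bmu[0]) * (r - bmu[0]) + (c - bmu[1]) * (c - bmu[1]);
                        double influence = Math.exp(-gridDist2 / (2 * radius * radius));
                        for (int d = 0; d < dim; d++)
                            neurons[r][c][d] += learningRate * influence * (x[d] - neurons[r][c][d]);
                    }
                }
            }
        }
        System.out.println("Trained " + rows + "x" + cols + " neuron matrix.");
    }

    // Cell whose vector has the smallest Euclidean distance to x.
    static int[] bestMatchingUnit(double[][][] neurons, double[] x) {
        int[] best = {0, 0};
        double bestDist = Double.MAX_VALUE;
        for (int r = 0; r < neurons.length; r++) {
            for (int c = 0; c < neurons[r].length; c++) {
                double dist = 0;
                for (int d = 0; d < x.length; d++)
                    dist += (neurons[r][c][d] - x[d]) * (neurons[r][c][d] - x[d]);
                if (dist < bestDist) { bestDist = dist; best = new int[]{r, c}; }
            }
        }
        return best;
    }
}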

Finally, Wandora opens a visualization of the neuron matrix and places each grouping topic in the matrix cell that best matches its training vector. The visualization allows the user to create new associations that group similar topics, i.e. topics located near each other in the matrix.
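
Once every grouping topic has been placed in a cell, "near each other" can be made concrete by comparing grid coordinates. The sketch below uses made-up placement coordinates and a made-up distance threshold purely to illustrate how grouping associations could be proposed for topics whose cells lie close together:

import java.util.*;

// Illustrative follow-up: group topics whose best matching cells are close on
// the grid. The placements are invented example coordinates, not real output.
public class GroupByProximity {
    public static void main(String[] args) {
        Map<String, int[]> placement = new LinkedHashMap<>();
        placement.put("a", new int[]{1, 1});
        placement.put("b", new int[]{1, 4});
        placement.put("c", new int[]{4, 0});
        placement.put("d", new int[]{2, 4});

        double threshold = 1.5;  // maximum grid distance for "near each other"
        List<String> labels = new ArrayList<>(placement.keySet());
        for (int i = 0; i < labels.size(); i++) {
            for (int j = i + 1; j < labels.size(); j++) {
                int[] p = placement.get(labels.get(i));
                int[] q = placement.get(labels.get(j));
                double dist = Math.hypot(p[0] - q[0], p[1] - q[1]);
                if (dist <= threshold)
                    System.out.println("group: " + labels.get(i) + " - " + labels.get(j));
            }
        }
        // With these example coordinates only b and d are grouped (grid distance 1).
    }
}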

Additional notes

  • Different runs with the same source data may result in different SOMs due to the initial random vectors in the neuron matrix.
  • You will probably get the best results if the training vectors are not orthogonal.