A TEMPORAL ATTRACTOR FRAMEWORK FOR THE DEVELOPMENT OF ANALOGICAL COMPLETION
The current model is an adaptation of [1], extended to draw more complex and abstract analogies. Units are connected by two types of modifiable connections: fast connections, which transmit the current activation of the units, and slow connections, which implement a delay and transmit an earlier activation state of the network. The fast connections drive the network into attractor states corresponding to objects. The slow connections implement transformations between states by pushing the network out of its current stable state and into another attractor basin. Together, the fast and slow connections move the network from one attractor state to another in an ordered way. Since the network can learn transformations among more than two objects, we suggest how it could draw analogies involving more than two objects.
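The following is a minimal sketch, not the authors' implementation, of the fast/slow dynamic described above: Hopfield-style fast weights hold the state in an attractor for one object, while delayed hetero-associative slow weights push the state into the basin of a second object. The variable names (`fast_W`, `slow_W`, `delay`, `slow_gain`) and the specific learning rules are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_units = 20

# Two example object patterns stored as attractors (binary +/-1 states).
patterns = np.sign(rng.standard_normal((2, n_units)))

# Fast connections: auto-associative (Hopfield-style) weights that pull the
# state toward the nearest stored pattern (one attractor basin per object).
fast_W = (patterns.T @ patterns) / n_units
np.fill_diagonal(fast_W, 0.0)

# Slow connections: hetero-associative weights mapping pattern 0 to pattern 1,
# i.e. a learned transformation between attractor states (assumed rule).
slow_W = np.outer(patterns[1], patterns[0]) / n_units

delay = 5        # slow connections transmit the state from `delay` steps ago
slow_gain = 1.5  # strength of the delayed "push" out of the current basin
history = []

# Start from a noisy cue for object 0.
state = np.sign(patterns[0] + 0.3 * rng.standard_normal(n_units))

for t in range(40):
    history.append(state.copy())
    delayed = history[-delay] if len(history) >= delay else history[0]
    # Fast input stabilizes the current attractor; slow input, driven by the
    # delayed state, pushes the network toward the next attractor.
    net_input = fast_W @ state + slow_gain * (slow_W @ delayed)
    state = np.sign(net_input)
    state[state == 0] = 1

overlaps = patterns @ state / n_units
print("overlap with object 0:", overlaps[0], "| object 1:", overlaps[1])
```

Run as written, the state settles briefly near the object-0 attractor and is then driven into the object-1 attractor, illustrating the ordered transition between attractor states; chaining additional hetero-associative slow weights would, in the same spirit, let the network step through more than two objects.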