Path finding Framework using HRR
Algorithm and associated equations
Surabhi Gupta ’11 · Advisor: Prof. Audrey St. John
Roadmap • Circular Convolution • Associative Memory • Path finding algorithm
Hierarchical environment • Locations are hierarchically clustered. [Figure: tree of locations: leaves a–r cluster into scale-1 locations X1–X6, which cluster into scale-2 locations Y1 and Y2 under the root Z.]
Tree representation • The scale of a location corresponds to its height in the tree structure. • Any node of the tree can be queried directly, without pointer following. • Maximum number of goal searches = height of the tree.
Circular Convolution Holographic Reduced Representations
Circular Convolution (HRR) • Developed by Tony Plate in 1991 • Binding (encoding) operation – Convolution • Decoding operation – Involution followed by convolution
Basic Operations • Binding • Merge
Binding (encoding): C = A ⊛ B. The bound vector is not similar to either constituent: C ≁ A and C ≁ B.
Circular Convolution (⊛) • The elements of the outer product of a and b are summed along the trans-diagonals (Plate, 1991): cⱼ = Σₖ aₖ b₍ⱼ₋ₖ₎ mod n, for k = 0, …, n−1.
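The trans-diagonal sum can be written out directly and checked against the FFT identity (circular convolution is elementwise multiplication in the frequency domain). A minimal sketch using NumPy; the function name `cconv` is mine:

```python
import numpy as np

def cconv(a, b):
    """Circular convolution: c[j] = sum_k a[k] * b[(j - k) mod n].

    This sums the outer product a_k * b_m along the trans-diagonals
    k + m = j (mod n), as in Plate (1991)."""
    n = len(a)
    return np.array([sum(a[k] * b[(j - k) % n] for k in range(n))
                     for j in range(n)])

rng = np.random.default_rng(0)
n = 512
a = rng.normal(0, 1 / np.sqrt(n), n)  # elements ~ N(0, 1/n)
b = rng.normal(0, 1 / np.sqrt(n), n)
c = cconv(a, b)

# Same result via FFT (much faster for long vectors).
c_fft = np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))
assert np.allclose(c, c_fft)
```

In practice the FFT route is the usual implementation; the explicit double loop is only there to show the trans-diagonal structure.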
Involution • The involution a* of a, defined by a*ᵢ = a₍₋ᵢ₎ mod n, gives the approximate inverse under convolution: decoding a binding C = A ⊛ B is B ≈ A* ⊛ C.
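Decoding by involution-then-convolution can be verified numerically: the decoded vector is a noisy but recognizable copy of the bound-in item. A sketch under the deck's 2048-dimensional, N(0, 1/2048) vector assumption; helper names are mine:

```python
import numpy as np

def cconv(a, b):
    # circular convolution via FFT
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def involution(a):
    """a*[i] = a[(-i) mod n]: index 0 stays put, the rest reverse."""
    return np.concatenate(([a[0]], a[1:][::-1]))

def sim(x, y):
    """Normalized dot product."""
    return x @ y / (np.linalg.norm(x) * np.linalg.norm(y))

rng = np.random.default_rng(1)
n = 2048
a = rng.normal(0, 1 / np.sqrt(n), n)
b = rng.normal(0, 1 / np.sqrt(n), n)

c = cconv(a, b)                     # bind
b_hat = cconv(involution(a), c)     # decode: involution, then convolution

assert sim(b_hat, b) > 0.5          # noisy copy of b
assert abs(sim(b_hat, a)) < 0.3     # not similar to the cue a
```

The decoded vector is only approximately b, which is why a clean-up (associative) memory is needed later in the deck.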
Merge • Superposition: vectors are added and the sum is normalized. Similarity between vectors is measured with the normalized dot product.
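A merged vector stays similar to each of its constituents, which is what makes superposition useful for storing sets. A small sketch (NumPy; `merge` and `sim` are my names):

```python
import numpy as np

def sim(x, y):
    """Normalized dot product (cosine similarity)."""
    return x @ y / (np.linalg.norm(x) * np.linalg.norm(y))

def merge(*vs):
    """Superimpose vectors and renormalize."""
    s = np.sum(vs, axis=0)
    return s / np.linalg.norm(s)

rng = np.random.default_rng(2)
n = 2048
a, b, c = (rng.normal(0, 1 / np.sqrt(n), n) for _ in range(3))

m = merge(a, b)
# the merged vector is similar to each constituent...
assert sim(m, a) > 0.5 and sim(m, b) > 0.5
# ...but not to an unrelated vector
assert abs(sim(m, c)) < 0.3
```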
Properties • Commutativity: A ⊛ B = B ⊛ A • Distributivity: A ⊛ (B + C) = A ⊛ B + A ⊛ C (shown for sufficiently long vectors) • Associativity: (A ⊛ B) ⊛ C = A ⊛ (B ⊛ C)
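All three algebraic properties can be checked numerically; they hold exactly for circular convolution, not just approximately. A quick verification sketch:

```python
import numpy as np

def cconv(a, b):
    # circular convolution via FFT
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

rng = np.random.default_rng(3)
n = 2048
a, b, c = (rng.normal(0, 1 / np.sqrt(n), n) for _ in range(3))

# commutativity: a (*) b = b (*) a
assert np.allclose(cconv(a, b), cconv(b, a))
# associativity: (a (*) b) (*) c = a (*) (b (*) c)
assert np.allclose(cconv(cconv(a, b), c), cconv(a, cconv(b, c)))
# distributivity over addition: a (*) (b + c) = a (*) b + a (*) c
assert np.allclose(cconv(a, b + c), cconv(a, b) + cconv(a, c))
```

The "sufficiently long vectors" caveat on the slide matters for *decoding* superpositions reliably, not for the identities themselves.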
Associative Memory Recall and retrieval of locations
Framework [Figure: the hierarchical environment again: leaves a–r, scale-1 locations X1–X6, scale-2 locations Y1 and Y2, root Z.]
Assumptions • Perfect tree: each leaf has the same depth. • Locations within a scale are fully connected, e.g. a, b, and c; X4, X5, and X6; etc. • Each constituent has the same contribution to the scale location (no bias).
Associative Memory • Consists of a list of locations. • Takes a location as input and returns the most similar location from the list. What do we store?
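A clean-up memory of this kind is simple to sketch: store the location vectors, and answer a noisy query with the stored item of highest normalized dot product. The class shape and names below are mine, not from the talk:

```python
import numpy as np

class AssociativeMemory:
    """Stores named location vectors; recall returns the stored
    name whose vector best matches a (possibly noisy) query."""

    def __init__(self):
        self.names, self.vecs = [], []

    def add(self, name, v):
        self.names.append(name)
        self.vecs.append(v / np.linalg.norm(v))

    def recall(self, query):
        q = query / np.linalg.norm(query)
        sims = [q @ v for v in self.vecs]
        return self.names[int(np.argmax(sims))]

rng = np.random.default_rng(4)
n = 2048
mem = AssociativeMemory()
items = {name: rng.normal(0, 1 / np.sqrt(n), n) for name in "abc"}
for name, v in items.items():
    mem.add(name, v)

# a noisy version of b still recalls b
noisy = items["b"] + 0.5 * rng.normal(0, 1 / np.sqrt(n), n)
assert mem.recall(noisy) == "b"
```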
Scales • Locations a–r are each 2048-dimensional vectors with elements drawn from a normal distribution N(0, 1/2048). • Higher scales: recursive auto-convolution of constituents.
Constructing scales • X1 = a ⊛ a + b ⊛ b + c ⊛ c (normalized): a scale-1 location superimposes the auto-convolutions of its constituents.
Across-scale sequences • Encoded between each location and the corresponding locations at higher scales, e.g. the chain a → X1 → Y1 → Z.
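The slides do not spell out the exact sequence encoding, but one plausible scheme is to bind each location to its next scale up and superimpose the pairs; decoding with a location's involution then retrieves (a noisy copy of) its parent scale. A sketch under that assumption:

```python
import numpy as np

def cconv(a, b):
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def involution(a):
    return np.concatenate(([a[0]], a[1:][::-1]))

def sim(x, y):
    return x @ y / (np.linalg.norm(x) * np.linalg.norm(y))

rng = np.random.default_rng(7)
n = 2048
a, X1, Y1, Z = (rng.normal(0, 1 / np.sqrt(n), n) for _ in range(4))

# assumed encoding of the across-scale chain a -> X1 -> Y1 -> Z:
# bind each location to its successor and superimpose
S = cconv(a, X1) + cconv(X1, Y1) + cconv(Y1, Z)

# decoding S with a's involution retrieves the next scale up, X1
next_scale = cconv(involution(a), S)
assert sim(next_scale, X1) > 0.3
assert abs(sim(next_scale, Z)) < 0.25
```

The retrieved vector would then be passed through a clean-up memory to recover an exact stored location.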
Path finding algorithm Quite different from standard graph search algorithms…
Path finding algorithm
1. Start: if Start == Goal, stop.
2. Search for the goal at the current scale; if the goal is not found at this scale, go to a higher scale and search again.
3. If the goal is found at this scale, retrieve the scales corresponding to the goal.
4. Move towards the Goal.
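The control flow above can be sketched on the talk's example tree, with plain Python dicts standing in for the HRR memories (the location names are from the slides; the code shape is mine):

```python
# leaves a-r, scale-1 X1-X6, scale-2 Y1-Y2, root Z
children = {
    "Z": ["Y1", "Y2"],
    "Y1": ["X1", "X2", "X3"], "Y2": ["X4", "X5", "X6"],
    "X1": list("abc"), "X2": list("def"), "X3": list("ghi"),
    "X4": list("jkl"), "X5": list("mno"), "X6": list("pqr"),
}
parent = {c: p for p, cs in children.items() for c in cs}

def contains(scale, goal):
    """Stand-in for the HRR test 'is the goal contained at this scale?'"""
    kids = children.get(scale, [])
    return goal in kids or any(contains(k, goal) for k in kids)

def find_path_scales(start, goal):
    """Climb until a scale containing the goal is found, then return
    the chain of scales above the goal that the mover descends."""
    if start == goal:                  # Start == Goal? Done.
        return [start], []
    scale, climbed = start, [start]
    while not contains(scale, goal):   # goal not found at this scale
        scale = parent[scale]          # go to a higher scale
        climbed.append(scale)
    chain, node = [], parent[goal]     # retrieve the goal's scales
    while node != scale:
        chain.append(node)
        node = parent[node]
    return climbed, chain[::-1]

climbed, goal_scales = find_path_scales("a", "p")
assert climbed == ["a", "X1", "Y1", "Z"]     # climb to the root
assert goal_scales == ["Y2", "X6"]           # then descend towards p
```

In the actual framework both `contains` and the parent lookups are HRR operations (decoding plus clean-up memory) rather than pointer chasing; this sketch only shows the search structure.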
Retrieving the next scale • If at scale 0, query the AS memory to retrieve the across-scale sequence; else use the sequence retrieved in a previous step. • Query the L memory with the retrieved sequence.
Locating the goal • For example, with location X1 and goal c: decoding X1 with the involution of the goal yields a vector similar to c, so the goal is contained at this scale.
Locating the goal • Goal: p • Not contained in X1.
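The containment test can be sketched numerically: decode the scale vector with the goal's involution and compare the result to the goal itself. Under the deck's construction (scales as superposed auto-convolutions) and a threshold I chose for illustration:

```python
import numpy as np

def cconv(a, b):
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def involution(a):
    return np.concatenate(([a[0]], a[1:][::-1]))

def sim(x, y):
    return x @ y / (np.linalg.norm(x) * np.linalg.norm(y))

rng = np.random.default_rng(6)
n = 2048
a, b, c, p = (rng.normal(0, 1 / np.sqrt(n), n) for _ in range(4))

# X1 encodes a, b, c by auto-convolution, then is normalized
X1 = cconv(a, a) + cconv(b, b) + cconv(c, c)
X1 /= np.linalg.norm(X1)

def goal_at_scale(scale_vec, goal_vec, threshold=0.3):
    """Decode the scale with the goal's involution; high similarity
    to the goal means the goal is contained at this scale."""
    decoded = cconv(involution(goal_vec), scale_vec)
    return sim(decoded, goal_vec) > threshold

assert goal_at_scale(X1, c)        # c is a constituent of X1
assert not goal_at_scale(X1, p)    # p is not, so climb a scale
```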
Goal not found at Y1.
Goal found at Z!
Decoding scales • The same decoding operation (involution followed by convolution) is applied, now using the retrieved scales to descend the hierarchy towards the goal.
Moving to the Goal [Figure: the hierarchical environment map; the retrieved scales guide movement down the tree to the goal location.]
To work on • Relax the assumption of a perfect tree. • Relax the assumption of a fully connected graph within a scale location.
References • Kanerva, P. (2002). Distributed representations. In Encyclopedia of Cognitive Science. • Plate, T. A. (1991). Holographic reduced representations: Convolution algebra for compositional distributed representations. In J. Mylopoulos & R. Reiter (Eds.), Proceedings of the 12th International Joint Conference on Artificial Intelligence (pp. 30–35). San Mateo, CA: Morgan Kaufmann.