Semantic Workflow Encoding Using Vector Symbolic Architectures
Chris Simpkin (Cardiff University), Ian Taylor (Cardiff University), Graham Bent (IBM Research UK), Geeth De Mel (IBM Research UK), Swati Rallapalli (IBM Research US)
Project 5 - Instinctive Analytics in a Coalition Environment
• 'Brain-Like' Distributed Analytic Processing
• Self-Organising Distributed Analytics for Situation Understanding
• Declarative Approach for Distributed Analytics
Decentralized Microservice Workflows for Coalition Environments
• Motivation
• Symbolic Vector Representations
• Service Workflows as Symbolic Vectors
• Example of a decentralized workflow using symbolic vectors
• Next steps towards the DAIS vision of a "distributed coalition intelligence" ('brain')
Motivation
Objective: a common methodology to describe and orchestrate decentralized services to support mission-critical data analytics workflows.
Challenges:
• Obtaining a stable endpoint at which to deploy the service manager is impractical—if not impossible—due to the variable network connectivity associated with mobile endpoints (e.g., unmanned autonomous systems);
• High latency and cost associated with communication;
• Poor infrastructure, especially absent back-end connectivity.
Consequently, in dynamic environments a new class of workflow methodology is required—i.e., a workflow which operates in a decentralized manner.
Intelligent Distributed Analytic Compositions
Critical: services that are self-describing and discoverable.
Semantic Pointer Architecture (SPA): tensor binding, e.g. encoding "John loves Mary" (image credit: Chris Eliasmith), and random permutation binding (image credit: Pentti Kanerva).
Hypothesis: can we use SPA to enable services and data to be self-describing, and to compose the workflows needed to satisfy requirements?
Semantic Vectors for Cognitive Services
Why semantic vectors for services? Semantic vectors are capable of supporting a large range of cognitive tasks:
• Semantic composition
• Representing meaning and order
• Analogical mapping
• Reasoning
Semantic vectors are highly resilient to noise (CEMA) and are good candidates for broadcast communication using the HDMAC protocol.
Semantic vectors have neurologically plausible analogues which may be exploited in future distributed cognitive architectures.
Hyperdimensional Binary Symbolic Vectors
A hypercube of n-bit semantic vectors gives 2ⁿ possible vectors (semantic concepts).
[Figure: distribution of Hamming distance for pairs of randomly selected vectors. Image credit: Chris Eliasmith]
Large random binary vectors (n ≈ 10,000 bits) can be used to create symbolic vectors. Using simple binding operators, a sequence of symbol vectors can be represented in a single vector (a semantic pointer vector).
Sequence: A → B → C → D → E
Pi binding: V = π¹A + π²B + π³C + π⁴D + π⁵E, where π is a random permutation.
Pi unbinding: π⁻¹V = A + noise.
XOR binding: V = P₀·A + P₁·B + P₂·C + P₃·D + P₄·E, where Pᵢ·A denotes πⁱ(P₀) XOR A.
XOR unbinding: P₀ XOR V = A + noise.
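The binding and unbinding operators above are straightforward to prototype. Below is a minimal sketch (not the authors' implementation), assuming 10,000-bit random binary hypervectors, a majority-rule superposition for '+', and nearest-neighbour (Hamming) decoding against a codebook; all helper names are illustrative.

```python
# Minimal sketch of permutation (pi) binding and XOR binding over
# 10,000-bit binary hypervectors. Not the authors' implementation.
import numpy as np

N = 10_000
rng = np.random.default_rng(0)
perm = rng.permutation(N)              # the random permutation pi
inv_perm = np.argsort(perm)            # pi^-1

def rand_hv():
    """Random dense binary hypervector."""
    return rng.integers(0, 2, N, dtype=np.uint8)

def permute(v, times=1):
    """Apply pi (times > 0) or pi^-1 (times < 0) |times| times."""
    p = perm if times >= 0 else inv_perm
    for _ in range(abs(times)):
        v = v[p]
    return v

def bundle(vectors):
    """Majority-rule superposition ('+'); ties broken randomly."""
    s = np.sum(vectors, axis=0)
    out = (s * 2 > len(vectors)).astype(np.uint8)
    ties = (s * 2 == len(vectors))
    out[ties] = rng.integers(0, 2, ties.sum())
    return out

def hamming(a, b):
    """Normalised Hamming distance (0 = identical, ~0.5 = unrelated)."""
    return np.count_nonzero(a != b) / N

# Codebook of symbols A..E and the sequence A -> B -> C -> D -> E.
symbols = {name: rand_hv() for name in "ABCDE"}
seq = list("ABCDE")

# Pi binding:  V = pi^1 A + pi^2 B + pi^3 C + pi^4 D + pi^5 E
V_pi = bundle([permute(symbols[s], i + 1) for i, s in enumerate(seq)])
# Pi unbinding of the first element:  pi^-1 V ~ A + noise
probe = permute(V_pi, -1)
print(min(symbols, key=lambda k: hamming(probe, symbols[k])))   # -> 'A'

# XOR binding with position vectors P_i = pi^i(P0):
P0 = rand_hv()
V_xor = bundle([permute(P0, i) ^ symbols[s] for i, s in enumerate(seq)])
# XOR unbinding:  P0 XOR V ~ A + noise
probe = P0 ^ V_xor
print(min(symbols, key=lambda k: hamming(probe, symbols[k])))   # -> 'A'
```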
Binary Symbolic Vector Capacity
Due to the properties of high-dimensional space, vector A becomes undecodable when the upper bound of the bit density of the accumulated noise vector N approaches the lower bound of the density of A.
For n = 10,000 and a threshold of 10⁻⁶, the vector capacity is approximately 89 symbols.
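The effect behind this capacity limit can be seen empirically: as more symbols are superposed, the Hamming distance from an unbound probe to the correct symbol drifts towards the ~0.5 distance of unrelated vectors, so the signal sinks into the noise floor. A small illustrative sketch, reusing rand_hv, bundle, permute and hamming from the binding example above (the exact 89-symbol figure depends on the decoding threshold used in the analysis, which is not reproduced here):

```python
# Watch the target distance drift towards the ~0.5 noise floor as the
# number of superposed symbols k grows (illustrative only).
for k in (5, 20, 89, 200):
    book = [rand_hv() for _ in range(k)]
    V = bundle([permute(v, i + 1) for i, v in enumerate(book)])
    probe = permute(V, -1)                       # should contain book[0]
    print(f"k={k:4d}  d(target)={hamming(probe, book[0]):.3f}  "
          f"d(unrelated)={hamming(probe, rand_hv()):.3f}")
```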
Unbinding of Semantic Vectors
π-unbinding of a sequence of symbols using a circular buffer.
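One way to read the circular-buffer scheme (an assumption on our part) is that repeatedly applying the inverse permutation rotates each successive symbol into the decodable front position, so a whole sequence can be read back out of a single vector; if π is chosen as a cyclic shift this literally steps through a circular buffer. A sketch reusing the helpers from the binding example:

```python
# Decode a whole sequence from one bound vector by repeatedly applying
# the inverse permutation and doing a nearest-neighbour codebook lookup.
def decode_sequence(V, codebook, length):
    decoded = []
    probe = V
    for _ in range(length):
        probe = permute(probe, -1)       # rotate next symbol to the front
        name = min(codebook, key=lambda k: hamming(probe, codebook[k]))
        decoded.append(name)
    return decoded

codebook = {name: rand_hv() for name in "ABCDE"}
V = bundle([permute(codebook[s], i + 1) for i, s in enumerate("ABCDE")])
print(decode_sequence(V, codebook, 5))   # -> ['A', 'B', 'C', 'D', 'E']
```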
Recursive Binding & Unbinding of Semantic Vectors
[Figure: a hierarchy A → (B1…B4) → (C1…C4) → (D1…Dn). Each vector is bound from the level below, e.g. A = T + p0.B1 + p0.p1.B2 + …, B1 = T + p0.C1 + p0.p1.C2 + …, C1 = T + p0.D2 + p0.p1.D5 + …. Unbinding proceeds from the top, e.g. A₁ = (p0.A)⁻¹ = T + B1 + p1⁻¹.B2 + …; recovering the tag T allows the next unbinding multiplier to be calculated, and the same step is repeated at each level (B₁ = (p0.B1)⁻¹, C₁ = (p0.C1)⁻¹, …).]
• Semantic pointer vectors are built from lower semantic levels to higher levels.
• Semantic pointer vectors are self-describing and can be compared with each other.
• Semantic pointer vectors can then be recursively unbound, by vector unicast/broadcast, from the highest level.
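A minimal sketch of recursive binding and unbinding over a two-level hierarchy, reusing the helpers from the binding example. The names (T, B1, B2, C1…C4) follow the figure, but the construction is simplified: a single permutation applied i times stands in for the p0, p1 multipliers.

```python
# Build a small hierarchy bottom-up, then unbind it top-down.
T = rand_hv()                                    # level tag

leaves = {name: rand_hv() for name in ("C1", "C2", "C3", "C4")}
B1 = bundle([T, permute(leaves["C1"], 1), permute(leaves["C2"], 2)])
B2 = bundle([T, permute(leaves["C3"], 1), permute(leaves["C4"], 2)])
A  = bundle([T, permute(B1, 1), permute(B2, 2)])

def unbind_children(parent, candidates, positions=2):
    """Recover the child vectors bound at positions 1..positions."""
    out = []
    for i in range(1, positions + 1):
        probe = permute(parent, -i)
        out.append(min(candidates, key=lambda k: hamming(probe, candidates[k])))
    return out

print(unbind_children(A, {"B1": B1, "B2": B2}))  # -> ['B1', 'B2']
print(unbind_children(B1, leaves))               # -> ['C1', 'C2']
```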
Alternative Service Selection Using Semantic Similarity
[Figure: the same hierarchy A → (B1…B4) → (C1…C4) → (D1…Dn), with a semantically similar service C2′ substituted for C2.]
• Since semantic vectors are self-describing and can be compared with each other, an alternative, semantically similar service is selected when the best choice is not available.
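A sketch of how the similarity comparison could drive the selection, assuming service vectors are bundled from shared attribute vectors so that similar services really are close in Hamming distance; the attribute names are invented for illustration, and the helpers come from the binding example.

```python
# Pick the closest available service when the requested one is down.
attrs = {a: rand_hv() for a in ("image", "classify", "gpu", "cpu", "audio")}

def service_vec(attributes):
    """A service described as a superposition of its attribute vectors."""
    return bundle([attrs[a] for a in attributes])

C2       = service_vec(["image", "classify", "gpu"])   # requested service
C2_prime = service_vec(["image", "classify", "cpu"])   # similar alternative
D1       = service_vec(["audio", "classify", "cpu"])   # unrelated service

available = {"C2_prime": C2_prime, "D1": D1}           # C2 itself is offline
best = min(available, key=lambda k: hamming(C2, available[k]))
print(best, hamming(C2, C2_prime), hamming(C2, D1))    # 'C2_prime', ~0.25, ~0.38
```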
Anticipatory Self-Provisioning
[Figure: look-ahead from A to the services B1…B4 at the level below.]
• If services can communicate (e.g. via multicast/broadcast), then services at different levels can unbind vectors from the levels above and anticipate when they are going to be invoked.
• Services can monitor the progress of services at the level below and determine the progress of the workflow relative to themselves.
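One possible realisation of the look-ahead (an assumption, reusing the earlier helpers): a service unbinds successive future positions of a workflow vector it overhears on the broadcast channel and checks whether its own vector appears within a similarity threshold, giving it advance notice of when it will be invoked.

```python
# A service scans ahead through an overheard workflow vector for itself.
def positions_of(my_vec, workflow_vec, max_depth=10, threshold=0.45):
    """Return the step numbers at which my_vec appears in workflow_vec."""
    hits, probe = [], workflow_vec
    for step in range(1, max_depth + 1):
        probe = permute(probe, -1)
        if hamming(probe, my_vec) < threshold:
            hits.append(step)
    return hits

services = [rand_hv() for _ in range(5)]
W = bundle([permute(v, i + 1) for i, v in enumerate(services)])
print(positions_of(services[3], W))   # -> [4]: this service runs at step 4
```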
Hamlet - a Distributed Workflow Example
Number of words: 29,770
Number of unique words: 4,620
Interactions at Scene Level
Hamlet as a Distributed Workflow
[Figure: the play encoded as a hierarchy: the Hamlet vector H → acts/scenes (A1S1, A1S2, …, A5S2) → sub-scenes (BS1, FS1, BS2, …, MSn) → individual words (e.g. "who's", "there", "nay", "answer", "me", "stand"); e.g. H = T + p0.A1S1 + p0.p1.A1S2 + …, A1S1 = T + p0.BS1 + p0.p1.FS1 + …, BS1 = p0.w2 + p0.p1.w4 + …, with recursive unbinding from H downwards.]
We have demonstrated how services can learn the play and then recursively unbind it in response to the transmission of the Hamlet vector.
Complex Workflows: Pegasus Workflow Generator, DAX workflow (Montage_20 workflow).
Complex Workflows: Workflow Vector Generation, Recruitment Phase (Pegasus Workflow Generator → VSA vector generation).
Complex Workflows: Workflow Vector Generation, Connect Nodes Phase (Pegasus Workflow Generator → VSA vector generation); a sketch of both phases follows below.
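A hedged sketch of the two phases named above, using a hand-written stand-in for a Pegasus DAX (a list of jobs plus parent→child edges) rather than the real generator; the phase logic and data structures are assumptions, and the helpers come from the binding example.

```python
# Stand-in for a tiny DAX: abstract jobs and their dependency edges.
workflow = {
    "jobs":  ["S1", "S2", "S3", "S5"],
    "edges": [("S1", "S2"), ("S1", "S3"), ("S2", "S5"), ("S3", "S5")],
}

# Recruitment phase: every abstract job is assigned (or advertises) a vector.
job_vec = {j: rand_hv() for j in workflow["jobs"]}

# Connect-nodes phase: each node's vector superposes its own vector with the
# permuted vectors of its children, so that a node can unbind the vector it
# receives to discover which service(s) to hand off to next.
node_vec = {}
for j in workflow["jobs"]:
    children = [c for (p, c) in workflow["edges"] if p == j]
    parts = [job_vec[j]] + [permute(job_vec[c], i + 1)
                            for i, c in enumerate(children)]
    node_vec[j] = bundle(parts)

# Example: S1 unbinds its node vector to find its successors S2 and S3.
for pos in (1, 2):
    probe = permute(node_vec["S1"], -pos)
    print(min(job_vec, key=lambda k: hamming(probe, job_vec[k])))  # S2, S3
```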
Complex Workflows: Workflow Execution
[Animation frames stepping through the execution of the workflow, service by service: S1, S2, S3, S5, …, S21, S22, with the remaining services proceeding similarly.]
Complex Pegasus Workflows
Distributed Workflow in Networks: decentralized workflow evaluated in the CORE emulation evaluation framework.
Acknowledgement
This research was sponsored by the U.S. Army Research Laboratory and the U.K. Ministry of Defence under Agreement Number W911NF-16-3-0001. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the U.S. Army Research Laboratory, the U.S. Government, the U.K. Ministry of Defence or the U.K. Government. The U.S. and U.K. Governments are authorized to reproduce and distribute reprints for Government purposes notwithstanding any copyright notation hereon.