  1. ICCL Summer School, TU Dresden, 1-3 September 2010. Neural-Symbolic Cognitive Reasoning. Artur d’Avila Garcez, City University London, aag@soi.city.ac.uk

  2. Motivation • The need for: learning from changes in the environment; reasoning about commonsense knowledge • The need for robustness: controlling the accumulation of errors in uncertain environments • Integrating reasoning and learning: symbolic systems are too brittle (commonsense cannot be axiomatized); neural networks are too complex (modularity, legacy systems, explanation) • Combining the logical nature of reasoning and the statistical nature of learning

  3. Outline • Overview of Neural-Symbolic Cognitive Model • Backpropagation: • worked example • evaluation: cross-validation/ embracing uncertainty • CILP translation algorithm, extraction, applications • Nonclassical CILP: modal, temporal, etc. • Fibring networks (specializations) • Relational / first-order CILP (propositionalization) • Abductive reasoning, attention, emotions, creativity, etc.

  4. Neural-Symbolic Computation is... ...interdisciplinary: Cognitive Science, Logic, Machine Learning, Probability Theory, Computer Science, Neural Computation, Neuroscience ...related to SRL and ILP, but underpinned by neural computation

  5. IET/BCS Turing Lecture 2010 (Chris Bishop) 1960s-1980s: Expert Systems (hand-crafted rules). “Within a generation... the problem of creating 'artificial intelligence' will largely be solved” (Marvin Minsky, 1967). 1990s-present: Neural networks, Support vector machines (difficult to include domain knowledge). New AI: Bayesian learning, probabilistic graphical models, efficient inference

  6. One Algorithm for Learning and Reasoning [diagram: high-level symbolic representations (abstraction, recursion, relations) are connected by translations to low-level, efficient neural structures (with the same, simple architecture throughout)]

  7. Neural-Symbolic Learning Systems [diagram of the neural-symbolic learning cycle: symbolic knowledge, translation into a neural network, learning from examples, the connectionist system as inference machine, knowledge extraction and explanation back to symbolic knowledge]

  8. Connectionist Inductive Logic Programming (CILP) System A Neural-Symbolic System for Integrated Reasoning and Learning • Knowledge Insertion, Revision (Learning), Extraction (based on Towell and Shavlik, Knowledge-Based Artificial Neural Networks. Artificial Intelligence, 70:119-165, 1994) • Real Applications: DNA Sequence Analysis, Power Systems Fault Diagnosis (using backpropagation with background knowledge; test set performance is comparable to backpropagation; test set performance on smaller training sets is comparable to KBANN; training set performance is superior to that of backpropagation and KBANN)

  9. CILP Translation Algorithm Program: r1: A ← B, C, ~D; r2: A ← E, F; r3: B ← . [diagram: input neurons B, C, D, E, F; one hidden neuron h1, h2, h3 per clause with threshold θ; output neurons A and B; connections of weight W, with -W for the negated literal D]. Interpretations based on Hölldobler and Kalinke’s translation, but extended to sigmoid neurons (backprop) and hetero-associative networks. Hölldobler and Kalinke, Towards a Massively Parallel Computational Model for Logic Programming. ECAI Workshop on Combining Symbolic and Connectionist Processing, 1994.
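
To make the translation step concrete, here is a minimal Python sketch of the CILP-style mapping from a propositional program to a single-hidden-layer network, assuming bipolar activations in {-1, 1}, hard-threshold units instead of CILP's sigmoid neurons, and a single illustrative weight value W (the CILP papers derive W and the thresholds from the program's structure; the function names below are not from the slides).

```python
import numpy as np

W = 4.0  # a "sufficiently large" weight; CILP computes a lower bound for it

def translate(clauses, atoms):
    """clauses: list of (head, positive_body, negative_body) triples."""
    n_in, n_hid = len(atoms), len(clauses)
    idx = {a: i for i, a in enumerate(atoms)}
    W_ih = np.zeros((n_hid, n_in))   # input -> hidden (one hidden neuron per clause)
    W_ho = np.zeros((n_in, n_hid))   # hidden -> output (one output neuron per atom)
    theta_h = np.zeros(n_hid)
    theta_o = np.zeros(n_in)
    for h, (head, pos, neg) in enumerate(clauses):
        for a in pos:
            W_ih[h, idx[a]] = W      # positive body literal: excitatory connection
        for a in neg:
            W_ih[h, idx[a]] = -W     # negated body literal: inhibitory connection
        # hidden neuron h fires only when all of its body literals are satisfied
        theta_h[h] = W * (len(pos) + len(neg) - 1)
        W_ho[idx[head], h] = W       # clause h supports its head atom
        # the output neuron fires if at least one of its clauses fires
        theta_o[idx[head]] = -W * (np.count_nonzero(W_ho[idx[head]]) - 1)
    return W_ih, theta_h, W_ho, theta_o

def step(x, W_ih, theta_h, W_ho, theta_o):
    """One application of the immediate-consequence operator T_P."""
    hidden = np.where(W_ih @ x - theta_h > 0, 1.0, -1.0)
    return np.where(W_ho @ hidden - theta_o > 0, 1.0, -1.0)

atoms = ["A", "B", "C", "D", "E", "F"]
clauses = [("A", ["B", "C"], ["D"]),   # r1: A <- B, C, ~D
           ("A", ["E", "F"], []),      # r2: A <- E, F
           ("B", [], [])]              # r3: B <-  (a fact)
net = translate(clauses, atoms)
x = -np.ones(len(atoms))               # start from "everything false"
for _ in range(3):                     # iterate T_P towards the fixed point
    x = step(x, *net)
print(dict(zip(atoms, x)))             # expect B true, all other atoms false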

  10. CILP Extraction Algorithm [diagram: the lattice of bipolar input vectors over {a, b, c}, from {1,1,1} down to {-1,-1,-1}, with each vector queried against the hidden neuron, e.g. b, c → h = 1; a, c → h = 1; a, b → h = 1; a → h = 0; b → h = 0; c → h = 0]. Challenge: efficient extraction of sound, comprehensible symbolic knowledge from large-scale neural networks.
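
As a toy illustration of the extraction idea on this slide, the sketch below enumerates all bipolar input combinations of a single hypothetical hidden neuron over inputs a, b, c and tabulates when it fires; the weights and threshold are invented for illustration, not taken from any trained network, and the real CILP algorithm prunes this search using the ordering on input vectors rather than enumerating all of them.

```python
from itertools import product

def neuron(a, b, c, w=(2.0, 2.0, 1.0), theta=2.5):
    """A hypothetical trained neuron: fires (returns 1) iff the weighted sum exceeds theta."""
    return 1 if w[0] * a + w[1] * b + w[2] * c > theta else 0

for a, b, c in product((1, -1), repeat=3):
    fired = neuron(a, b, c)
    literals = [name if v == 1 else "~" + name
                for name, v in zip("abc", (a, b, c))]
    print(f"{{{a},{b},{c}}}  {', '.join(literals)} -> h = {fired}")

# From the table one reads off rules such as "h <- a, b" (the neuron fires
# whenever a and b are true, regardless of c): the kind of sound,
# comprehensible knowledge the extraction algorithm aims to recover.
```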

  11. Publications Garcez, Zaverucha. The CILP System. Applied Intelligence 11:59-77, 1999. Garcez, Broda, Gabbay. Knowledge Extraction from Neural Nets. Artificial Intelligence 125:153-205, 2001. Garcez, Broda, Gabbay. Neural-Symbolic Learning Systems. Springer, 2002.

  12. CILP extensions • Non-Classical Reasoning • Modal, Temporal, Epistemic, Intuitionistic, Abductive Reasoning, Value-based Argumentation. • New potential applications including temporal logic learning, model checking, software engineering (requirements evolution), etc.

  13. Connectionist Modal Logic (CML) CILP network ensembles: modularity for learning, accessibility relations, disjunctive information [diagram: an ensemble of networks for worlds W1, W2, W3]

  14. Semantics of □ and ◊ A proposition is necessary (□) in a world if it is true in all worlds which are possible in relation to that world. A proposition is possible (◊) in a world if it is true in at least one world which is possible in relation to that same world.
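
The following small Python sketch spells out these two definitions over an explicit accessibility relation; the valuation for the worlds W1, W2, W3 is an illustrative assumption, echoing the worlds used in the surrounding slides.

```python
R = {"W1": {"W2", "W3"}, "W2": set(), "W3": set()}   # worlds accessible from each world
V = {"W1": {"q"}, "W2": {"p", "q"}, "W3": {"q"}}     # propositions true at each world

def box(p, w):      # "necessarily p" at w: p holds in all worlds accessible from w
    return all(p in V[v] for v in R[w])

def diamond(p, w):  # "possibly p" at w: p holds in at least one world accessible from w
    return any(p in V[v] for v in R[w])

print(box("q", "W1"))      # True: q holds in both W2 and W3
print(diamond("p", "W1"))  # True: p holds in W2
print(box("p", "W1"))      # False: p fails in W3
```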

  15. Representing □ and ◊ [diagram: a world W1 with accessible worlds W2 and W3, and propositions p and q holding at some of them]

  16. CML Translation Algorithm Translates modal programs into ensembles of CILP networks, i.e. clauses Wi: ML1, ..., MLn → MA and relations R(Wa, Wb) between worlds Wa and Wb, with M in {□, ◊}. Theorem: For any modal program P there exists an ensemble of simple neural networks N such that N computes P.
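
For a rough picture of the data structures involved, the sketch below builds one propositional CILP network per world, reusing the translate function from the CILP sketch above; the worlds, clauses, and accessibility pairs are invented for illustration, and the connections implementing □ and ◊ across networks are omitted.

```python
modal_program = {
    "worlds": {
        "W1": [("q", ["p"], [])],    # W1: q <- p
        "W2": [("p", [], [])],       # W2: p <-  (a fact)
        "W3": [("q", [], [])],       # W3: q <-
    },
    "accessibility": [("W1", "W2"), ("W1", "W3")],
}

atoms = ["p", "q"]
ensemble = {w: translate(clauses, atoms)   # one CILP network per world
            for w, clauses in modal_program["worlds"].items()}
```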

  17. Learning in CML We have applied CML to a benchmark distributed knowledge representation problem: the muddy children puzzle (children are playing in a garden; some have mud on their faces, some don’t; they can see if the others are muddy, but not themselves; a caretaker asks: do you know if you’re muddy? At least one of you is) Learning with modal background knowledge offers better accuracy than learning by examples only (93% vs. 84% test set accuracy)

  18. Connectionist Temporal Reasoning A full solution to the muddy children puzzle can only be given by a two-dimensional network ensemble [diagram: one dimension ranges over Agents 1-3, the other over time points t1 (at least 1 muddy), t2 (at least 2 muddy), t3 (3 muddy children)]. Short-term and long-term memory.

  19. Publications Garcez, Gabbay, Ray, Woods. Abductive Reasoning in Neural- Symbolic Learning Systems. Topoi 26:37-49, 2007. Garcez, Lamb, Gabbay. Connectionist Modal Logic. TCS, 371: 34-53, 2007. Garcez, Lamb, Gabbay. Connectionist Computations of Intuitionistic Reasoning. TCS, 358:34-55, 2006. Garcez, Lamb. Connectionist Model for Epistemic and Temporal Reasoning. Neural Computation, 18:1711-1738, July 2006.

  20. Combining (Fibring) Networks [diagram: the output of Network A is fed through a fibring function into Network B]. Fibred networks approximate any polynomial function in unbounded domains.
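
A minimal sketch of the fibring idea, assuming the simplest arrangement: network A's output is passed through a fibring function that modulates network B's weights before B computes. The two tiny networks and the multiplicative fibring function are illustrative choices, not the construction from the fibring papers.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class Net:
    def __init__(self, w_in, w_out):
        self.w_in, self.w_out = np.array(w_in), np.array(w_out)
    def forward(self, x, w_in=None):
        w = self.w_in if w_in is None else w_in
        return float(self.w_out @ sigmoid(w @ np.array(x)))

net_a = Net([[1.0, -1.0]], [2.0])
net_b = Net([[0.5, 0.5]], [1.0])

def fibred_forward(xa, xb):
    ya = net_a.forward(xa)          # run network A
    fibred_w = net_b.w_in * ya      # fibring function: A's output rescales B's weights
    return net_b.forward(xb, w_in=fibred_w)

print(fibred_forward([1.0, 0.0], [1.0, 1.0]))
```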

  21. Relational Learning Inputs presented to P and Q at the same time trigger the learning process in the meta-level [diagram: object-level networks P and Q, with nodes X, Y, Z, connected to meta-level nodes α, β, γ, δ]. Experiments on the east-west trains dataset show an improvement from 62% (flat, propositional network) to 80% (meta-level network) in test set performance (leave-one-out cross-validation).

  22. FOL → ANN (propositionalisation) [diagram: ground atoms conform(x,1), conform(x,2), opposite(x,y), opposite(y,x), mesh(x,1), mesh(x,2) as input neurons of a feedforward network]
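
A small sketch of the propositionalisation step suggested by this slide: ground first-order atoms over a fixed set of objects become the input neurons of an ordinary feedforward network. The predicates echo the slide; treating x and y as the two objects of an example and 1, 2 as values is an assumption made for illustration.

```python
from itertools import product

objects, values = ["x", "y"], ["1", "2"]

# Enumerate the ground atoms; each one becomes an input neuron.
features = ([f"conform({o},{v})" for o, v in product(objects, values)]
            + [f"mesh({o},{v})" for o, v in product(objects, values)]
            + [f"opposite({a},{b})" for a, b in product(objects, objects) if a != b])

def to_vector(true_atoms):
    """Encode an example, given as a set of ground atoms, as a 0/1 input vector."""
    return [1.0 if f in true_atoms else 0.0 for f in features]

example = {"conform(x,1)", "opposite(x,y)", "mesh(x,2)"}
print(features)
print(to_vector(example))
```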

  23. Cognitive Model: Fibred Network Ensembles [diagram: object-level network ensembles connected to meta-level relations via fibring functions]

  24. Publications Garcez, Lamb, Gabbay. Neural-Symbolic Cognitive Reasoning. Springer, 2009. Lamb, Borges, Garcez. Connectionist Model for Temporal Synchronisation and Learning. AAAI 2007, July 2007. Borges, Garcez, Lamb. Integrating Model Verification and Self-Adaptation. ASE 2010, September 2010. Garcez, Gabbay. Fibring Neural Networks. AAAI 2004, July 2004.

  25. Current Work • First Order Logic Learning: encoding vs. propositionalisation • Neural Networks for Normative Systems: obligations, permissions, contrary to duty • Adding domain knowledge to deep belief networks: higher order logic • Neural Networks for Abductive Reasoning: creativity, emotions, attention • Application in software engineering: model checking + adaptation • Application in simulation environments: driving test, war games, robocup

  26. Conclusion: Why Neurons and Symbols To study the statistical nature of learning and the logical nature of reasoning. To provide a unifying foundation for robust learning and efficient reasoning. To develop effective computational systems for integrated reasoning and learning.
