Towards Quantum-Assisted Artificial Intelligence

Peter Wittek
Research Fellow, Quantum Information Theory Group, ICFO-The Institute of Photonic Sciences, Barcelona Institute of Science and Technology
& Academic Director, Quantum Machine Learning Initiative, Creative Destruction Lab, University of Toronto

November 2017
Max Tegmark (2017). Life 3.0.
The Virtuous Cycle!

Quantum information processing and quantum computing → machine learning and artificial intelligence:
• Quantum algorithms in machine learning
• Improved sample and computational complexity

Machine learning and artificial intelligence → quantum information processing and quantum computing:
• Reinforcement learning in control problems
• Deep learning and neural networks as representation
Diversity is key

Statistical learning theory: Marco Loog
Agency: Vedran Dunjko, Gael Sentís
(Discrete) optimization: Davide Venturelli
Quantum data: Jonathan Olson, William Santos, Antonio Gentile
Sampling: Nana Liu, Alejandro Perdomo-Ortiz, John Calsamiglia
Causal networks, kangaroos, and cockroaches: Christina Giarmatzi, Andreas Winter
Deep architectures & many-body physics: William Huggins, Shi-Ju Ran, Alexandre Dauphin
So let's add one more component...

Good Old-Fashioned AI:
• Formalize causal relations in higher-order logic.
• Classical data in, classical data out.
• Complexity of entailment is in NP.
• Fragile, and largely dead since the 1980s.

Add uncertainty:
• Bumps complexity to #P.
• Sampling helps.
• Took off in 2006, still a niche.
First-order logic

Basic components (abbreviated):
• Constant: representing objects in the domain, e.g., Alice, Bob.
• Variable: taking values in a domain, e.g., people.
• Predicate: representing relations among objects, e.g., Flies(x), Physicist(y), Coauthors(x,y).

Formulas:
• Atom: a predicate applied to a tuple of objects, e.g., Coauthors(x, Bob).
• Ground atom: an atom with no variables, e.g., Coauthors(Alice, Bob).
• Formula: atoms combined with logical connectives and quantifiers, e.g., ∀x (Flies(x) ⇒ Flies(MotherOf(x))).
Knowledge base

Knowledge base (KB): a conjunctive set of formulas.
• Every referee is competent: ∀x,y (Referees(x,y) ⇒ Competent(x))
• Referees of physicists are physicists: ∀x,y (Referees(x,y) ∧ Physicist(y) ⇒ Physicist(x))
Grounding out and Herbrand interpretation

Finite domain: {Alice, Bob}. Functions are not relevant; they serve as substitutions.

Grounding out the atoms (their number grows exponentially):
• Referees(Alice,Bob), Referees(Bob,Alice), Referees(Alice,Alice), Referees(Bob,Bob).
• Competent(Alice), Competent(Bob).
• Physicist(Alice), Physicist(Bob).

Grounding out the knowledge base (see the sketch below):
• Referees(Alice,Bob) ⇒ Competent(Alice)
• Referees(Bob,Alice) ⇒ Competent(Bob)
• Referees(Alice,Alice) ⇒ Competent(Alice)
• ...

Herbrand interpretation (possible world): assign a truth value to each ground atom.
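To make the combinatorics concrete, here is a minimal Python sketch (my illustration, not from the slides) that grounds out the atoms and the first KB formula over a finite domain; the predicate names and arities follow the slides, everything else is assumed.

```python
from itertools import product

# Predicates from the slides, with their arities.
predicates = {"Referees": 2, "Competent": 1, "Physicist": 1}
domain = ["Alice", "Bob"]

# Ground atoms: every predicate applied to every tuple of constants.
# A predicate of arity k over n constants has n**k groundings, so the
# total count grows exponentially with the arity.
ground_atoms = [
    f"{pred}({','.join(args)})"
    for pred, arity in predicates.items()
    for args in product(domain, repeat=arity)
]
print(ground_atoms)
# ['Referees(Alice,Alice)', 'Referees(Alice,Bob)', ..., 'Physicist(Bob)']

# Grounding the formula  forall x,y (Referees(x,y) => Competent(x)):
ground_formulas = [
    f"Referees({x},{y}) => Competent({x})"
    for x, y in product(domain, repeat=2)
]
print(ground_formulas)
```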
Deduction

Let us have a formula from outside the KB:
F: Bob referees Alice, and Bob is not competent: Referees(Bob,Alice) ∧ ¬Competent(Bob)

Problem of entailment: does KB ⊨ F hold? What about contradictions?
Restricted Boltzmann machines

E(v, h) = −∑_i a_i v_i − ∑_j b_j h_j − ∑_{i,j} v_i w_{i,j} h_j

Obtain a probability distribution:

P(v, h) = (1/Z) e^{−E(v, h)}

Trace out the hidden nodes to approximate a target probability distribution. This is a generative probabilistic model (a minimal numerical sketch follows).
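As a quick illustration (my own, not from the slides), the following NumPy sketch computes the energy above and, for a toy-sized model where Z is tractable, the exact joint and marginal distributions by brute-force enumeration; all sizes and parameter values are assumptions.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)
n_visible, n_hidden = 3, 2                  # toy sizes so Z is enumerable
a = rng.normal(size=n_visible)              # visible biases a_i
b = rng.normal(size=n_hidden)               # hidden biases b_j
W = rng.normal(size=(n_visible, n_hidden))  # couplings w_ij

def energy(v, h):
    # E(v,h) = -sum_i a_i v_i - sum_j b_j h_j - sum_ij v_i w_ij h_j
    return -(a @ v) - (b @ h) - (v @ W @ h)

# Exact partition function and joint P(v,h) by brute force (toy model only).
states = [(np.array(v), np.array(h))
          for v in product([0, 1], repeat=n_visible)
          for h in product([0, 1], repeat=n_hidden)]
Z = sum(np.exp(-energy(v, h)) for v, h in states)

def joint(v, h):
    return np.exp(-energy(v, h)) / Z

# Tracing out the hidden nodes gives the generative distribution over v.
def marginal(v):
    return sum(joint(v, np.array(h)) for h in product([0, 1], repeat=n_hidden))

print(marginal(np.array([1, 0, 1])))
```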
Probabilistic Graphical Models

Uncertainty (probabilities) and logical structure (independence constraints). Goal: a compact representation of a joint probability distribution. For binary random variables {X_1, ..., X_N} there are 2^N assignments.

Complexity is dealt with through graph theory:
• Factorization: compactness.
• Inference: reassembling factors.

Conditional independence (X ⊥ Y | Z), checked numerically below:
P(X=x, Y=y | Z=z) = P(X=x | Z=z) P(Y=y | Z=z)  ∀x ∈ X, y ∈ Y, z ∈ Z
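A minimal numerical check of the definition (my illustration, not part of the slides): build a joint distribution in which X and Y are conditionally independent given Z by construction, then verify the factorization. The distributions are randomly generated assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
p_z = np.array([0.4, 0.6])                   # P(Z)
p_x_given_z = rng.dirichlet([1, 1], size=2)  # row z is P(X | Z=z)
p_y_given_z = rng.dirichlet([1, 1], size=2)  # row z is P(Y | Z=z)

# joint[x, y, z] = P(X=x | Z=z) P(Y=y | Z=z) P(Z=z), so X ⊥ Y | Z holds.
joint = np.einsum("zx,zy,z->xyz", p_x_given_z, p_y_given_z, p_z)

# Check: P(X=x, Y=y | Z=z) == P(X=x | Z=z) P(Y=y | Z=z) for all x, y, z.
p_xy_given_z = joint / joint.sum(axis=(0, 1), keepdims=True)
factorized = np.einsum("zx,zy->xyz", p_x_given_z, p_y_given_z)
assert np.allclose(p_xy_given_z, factorized)
print("X ⊥ Y | Z holds for this joint distribution.")
```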
Markov random fields

The Ising model generalized to hypergraphs. A distribution factorizes over G if

P(X_1, ..., X_N) = (1/Z) P′(X_1, ..., X_N),

where P′(X_1, ..., X_N) = exp(−∑_k ε[C_k]) and each C_k is a clique in G.

Connection to Boltzmann machines (a sketch of the factorization follows).
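A small sketch of this factorization (my example; the clique energies are made-up values): a three-node pairwise MRF whose unnormalized measure is exp of minus the sum of clique energies ε[C_k].

```python
import numpy as np
from itertools import product

# Pairwise MRF on three binary variables with cliques {0,1} and {1,2}.
# epsilon[C_k] is the energy of clique C_k; the values are illustrative.
energy_01 = np.array([[0.0, 1.2], [1.2, 0.3]])  # epsilon over (x0, x1)
energy_12 = np.array([[0.5, 0.0], [2.0, 0.1]])  # epsilon over (x1, x2)

def unnormalized(x):
    x0, x1, x2 = x
    # P'(x) = exp(-(epsilon[C_1] + epsilon[C_2]))
    return np.exp(-(energy_01[x0, x1] + energy_12[x1, x2]))

Z = sum(unnormalized(x) for x in product([0, 1], repeat=3))

def prob(x):
    # P(x) = P'(x) / Z
    return unnormalized(x) / Z

print(prob((1, 0, 1)))
```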
Probabilistic inference and learning in Markov networks

How do we apply the learned model? Complexity is in #P. Two types of queries:
• Conditional probability: P(Y | E=e) = P(Y, e) / P(e).
• Maximum a posteriori: argmax_y P(y | e) = argmax_y ∑_z P(y, z | e).

Generic case for approximation: Markov chain Monte Carlo Gibbs sampling (a minimal sampler follows). What prevents us from accelerating this with quantum-enhanced sampling?
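Here is a minimal Gibbs sampler (my sketch, reusing the toy pairwise MRF from the previous example) that estimates a conditional probability query by resampling the non-evidence variables one at a time from their full conditionals; step counts and energies are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy pairwise MRF on three binary variables; same clique energies as above.
energy_01 = np.array([[0.0, 1.2], [1.2, 0.3]])
energy_12 = np.array([[0.5, 0.0], [2.0, 0.1]])

def total_energy(x):
    return energy_01[x[0], x[1]] + energy_12[x[1], x[2]]

def gibbs_estimate(query_var, evidence, n_sweeps=20000, burn_in=2000):
    """Estimate P(X_query = 1 | evidence) with single-site Gibbs updates."""
    x = [evidence.get(i, int(rng.integers(2))) for i in range(3)]
    hits = samples = 0
    for sweep in range(n_sweeps):
        for i in range(3):
            if i in evidence:          # evidence variables stay clamped
                continue
            # Full conditional of X_i given all other variables.
            weights = []
            for value in (0, 1):
                x[i] = value
                weights.append(np.exp(-total_energy(x)))
            p1 = weights[1] / (weights[0] + weights[1])
            x[i] = int(rng.random() < p1)
        if sweep >= burn_in:
            hits += x[query_var]
            samples += 1
    return hits / samples

# P(X_0 = 1 | X_2 = 1), estimated by sampling.
print(gibbs_estimate(query_var=0, evidence={2: 1}))
```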
Statistical relational learning

Uncertainty + relational structure: combine (first-order) logic and probabilistic graphical models. Non-IID data. Markov logic networks are a type of statistical relational learning.
Markov logic networks

The real world can never fully match a KB, so weight each formula in the KB: a high weight indicates high probability.

Markov logic network: apply a KB {F_i} with matching weights {w_i} to a finite set of constants C to define a Markov network:
• Add a binary node for each possible grounding of each atom.
• Add a binary feature for each possible grounding of each formula F_i.

It is like a template for generating Markov networks (see the sketch below).
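To show the template idea concretely, here is a hypothetical sketch (the formula representation and the weights are my own assumptions) that instantiates the ground nodes and weighted ground features of an MLN from the two slide formulas and C = {Alice, Bob}.

```python
from itertools import product

constants = ["Alice", "Bob"]

# KB formulas as grounding functions; the weights are illustrative.
def formula_1(x, y):
    return f"Referees({x},{y}) => Competent({x})"

def formula_2(x, y):
    return f"Referees({x},{y}) & Physicist({y}) => Physicist({x})"

weighted_kb = [(1.5, formula_1), (2.0, formula_2)]

# Nodes of the induced Markov network: one binary node per ground atom.
nodes = (
    [f"Referees({x},{y})" for x, y in product(constants, repeat=2)]
    + [f"Competent({x})" for x in constants]
    + [f"Physicist({x})" for x in constants]
)

# Features: one weighted binary feature per grounding of each formula.
features = [
    (w, f(x, y))
    for w, f in weighted_kb
    for x, y in product(constants, repeat=2)
]

print(len(nodes), "nodes;", len(features), "weighted ground features")
for w, g in features[:3]:
    print(w, g)
```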
An example

∀x,y (Referees(x,y) ⇒ Competent(x))
∀x,y (Referees(x,y) ∧ Physicist(y) ⇒ Physicist(x))
C = {Alice, Bob}

[Figure: the ground Markov network over the nodes Referees(Alice,Alice), Referees(Alice,Bob), Referees(Bob,Alice), Referees(Bob,Bob), Physicist(Alice), Physicist(Bob), Competent(Alice), Competent(Bob).]
What do we gain?

• First-order logic is recovered in the limit of infinite weights.
• Unlikely statements are assigned a low probability.
• A cross-over between formal reasoning and probabilistic inference.
Include evidence

[Figure: the same ground Markov network over the Referees, Physicist, and Competent nodes.]

We have true evidence for these: Physicist(Bob), Referees(Alice,Bob).
Assignment 1

[Figure: the ground Markov network with a candidate truth assignment to each node.]

With this assignment, we actually violate two formulas (a counting sketch follows):
• Referees(Alice,Bob) ⇒ Competent(Alice)
• Referees(Alice,Bob) ∧ Physicist(Bob) ⇒ Physicist(Alice)
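The probability an MLN assigns to a world depends on which ground formulas it satisfies. Below is a minimal sketch (my own; the weights are illustrative) that counts violated groundings for a world matching Assignment 1, where only the evidence atoms Referees(Alice,Bob) and Physicist(Bob) are true.

```python
from itertools import product
from math import exp

constants = ["Alice", "Bob"]

# A possible world: a truth value for every ground atom.
world = {f"Referees({x},{y})": False for x, y in product(constants, repeat=2)}
world.update({f"Competent({x})": False for x in constants})
world.update({f"Physicist({x})": False for x in constants})
world["Referees(Alice,Bob)"] = True
world["Physicist(Bob)"] = True

# Ground formulas as (weight, satisfied?) pairs; weights are illustrative.
groundings = []
for x, y in product(constants, repeat=2):
    # Referees(x,y) => Competent(x)
    groundings.append((1.5, not world[f"Referees({x},{y})"]
                            or world[f"Competent({x})"]))
    # Referees(x,y) & Physicist(y) => Physicist(x)
    groundings.append((2.0, not (world[f"Referees({x},{y})"]
                                 and world[f"Physicist({y})"])
                            or world[f"Physicist({x})"]))

violated = [w for w, satisfied in groundings if not satisfied]
print(len(violated), "violated ground formulas")  # -> 2, as on the slide

# Unnormalized weight of the world: exp of the summed weights of the
# satisfied groundings; violated formulas lower the world's probability.
print("world weight:", exp(sum(w for w, s in groundings if s)))
```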
Assignment 2

[Figure: the ground Markov network with a second candidate truth assignment to each node.]