Tensor what? An introduction to AI on mobile Luke Sleeman - Freelance Android developer http://lukesleeman.com luke.sleeman@gmail.com @LukeSleeman
Agenda • Intro - History of AI, recent breakthroughs, why AI on mobile is important • Part 1 - Key AI tech - Neural networks, image recognition, voice recognition, and how Siri works • Part 2 - TensorFlow - What TensorFlow does, server-side use, embedding TensorFlow in a mobile app, models built with TensorFlow • Part 3 - Demo - We recognise a banana with my phone! • Closing thoughts - Strong AI, why all this is important • Questions
Intro - A bit of history
Hal 9000 is a great example of “Strong AI”. Hal can: • Reason (use strategy, solve puzzles, and make judgments under uncertainty) • Represent knowledge, including commonsense knowledge • Plan • Learn • Communicate in natural language • Integrate all these skills towards common goals • Kill All Humans https://en.wikipedia.org/wiki/Artificial_general_intelligence
10 Print “Good morning Dave, I am HAL 9000”
10 Print “Good morning Dave, I am HAL 9000” 20 Input “What can I do for you?”, R$
10 Print “Good morning Dave, I am HAL 9000” 20 Input “What can I do for you?”, R$ ???
10 Print “Good morning Dave, I am HAL 9000” 20 Input “What can I do for you?”, R$ 30 Print “I'm sorry, Dave. I'm afraid I can't do that.”
10 Print “Good morning Dave, I am HAL 9000” 20 Input “What can I do for you?”, R$ 30 Print “I'm sorry, Dave. I'm afraid I can't do that.” 40 Goto 20
The First AI Winter “Within ten years a digital computer will be the world's chess champion” and “within ten years a digital computer will discover and prove an important new mathematical theorem.” - H. A. Simon and Allen Newell, 1958 “Within a generation ... the problem of creating 'artificial intelligence' will substantially be solved.” - Marvin Minsky, 1967
Present Day • IBM’s Deep Blue beats Kasparov - 1997 • DARPA self-driving car challenges - 2005, 2007 • IBM’s Watson wins Jeopardy! - 2011 • Google’s AlphaGo beats a Go champion - 2015
AI + Mobile = A great combo
Why AI on mobile is important Constrained input + limited interaction means intelligence is important
Part 1 - Key AI Tech
Key AI Tech - Neural Networks
Neural Networks - Classification problem • 2 dimensions to our data - x 1 , x 2 • Can draw it helpfully on a graph! if(x1 + x2 > 0){ … we are in! } else { … we are out! }
Neural Networks - Classification problem • 2 dimensions to our data - x 1 , x 2 • Can draw it helpfully on a graph! if(x1 + x2 > b ){ … we are in! } else { … we are out! }
Neural Networks - Classification problem • 2 dimensions to our data - x 1 , x 2 • Can draw it helpfully on a graph! if(( w1 * x1) + ( w2 * x2) > b){ … we are in! } else { … we are out! }
Neural Networks - Classification problem • 2 dimensions to our data - x 1 , x 2 • Can draw it helpfully on a graph! if((w1 *x1) + (w2 * x2) > b){ return 1.0; } else { return 0.0; }
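The decision rule on the slide is a single artificial neuron. As a minimal sketch (the weights and bias here are illustrative placeholders, not trained values):

```python
# A single neuron ("perceptron") is just the weighted-sum test from the
# slide, written as a function. w1, w2 and b are illustrative values.
def neuron(x1, x2, w1=1.0, w2=1.0, b=0.0):
    # Fire (1.0) if the weighted sum of the inputs crosses the threshold b
    if (w1 * x1) + (w2 * x2) > b:
        return 1.0
    else:
        return 0.0

print(neuron(2.0, 3.0))   # 2 + 3 > 0, so the neuron fires: 1.0
print(neuron(-2.0, 1.0))  # -2 + 1 <= 0, so it does not: 0.0
```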
Neural Networks - Classification problem
Neural Networks - Classification problem Let's do a demo!
Neural networks - Summary • They learn to recognise patterns from training data! • Work for complex patterns • Can take a long time to train • You have to choose the right hyper-parameters (number of nodes, how they are laid out, activation function, input transformation, etc)
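"Learning from training data" means nudging the weights until the network stops making mistakes. A toy sketch of the classic perceptron update rule, on hypothetical labelled points for the in/out boundary above:

```python
# Learn w1, w2 and b from labelled (x1, x2) points using the
# perceptron update rule. The data points here are made up.
w1, w2, b = 0.0, 0.0, 0.0
data = [((2.0, 1.0), 1.0), ((1.0, 2.0), 1.0),
        ((-1.0, -2.0), 0.0), ((-2.0, -1.0), 0.0)]

for _ in range(20):                      # a few passes over the training data
    for (x1, x2), target in data:
        out = 1.0 if (w1 * x1) + (w2 * x2) > b else 0.0
        err = target - out               # 0 when correct, +/-1 when wrong
        w1 += 0.1 * err * x1             # nudge the weights toward the answer
        w2 += 0.1 * err * x2
        b -= 0.1 * err                   # the threshold moves the other way
```

After training, every point in `data` is classified correctly. Real networks use many neurons and gradient descent rather than this single-neuron rule, which is where the long training times and hyper-parameter choices come in.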
Key AI Tech - Image search and Deep belief networks
Image search
Deep belief networks • Stack a number of layers together to make a DBN • Early layers learn to recognise simple features • Later use features to recognise larger objects • A type of ‘auto-encoder’ - learns from input • Early unsupervised training to recognise features • Later supervised training to learn what they ‘mean’
Deep belief network layers Unsupervised Learning of Hierarchical Representations with Convolutional Deep Belief Networks Honglak Lee, Roger Grosse, Rajesh Ranganath, and Andrew Y. Ng
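The "stack of layers" idea can be sketched in a few lines. This is a feed-forward caricature only: the weights below are hand-picked placeholders, whereas a real DBN learns each layer's weights from the input, unsupervised first:

```python
# Each layer turns the previous layer's features into higher-level ones.
def layer(inputs, weights):
    # One fully connected layer with the step activation from earlier slides
    return [1.0 if sum(w * x for w, x in zip(row, inputs)) > 0 else 0.0
            for row in weights]

pixels = [0.9, 0.2, 0.7, 0.1]                             # raw input values
edges = layer(pixels, [[1, -1, 0, 0], [0, 0, 1, -1]])     # simple features
objects = layer(edges, [[1, 1]])                          # larger structures
print(objects)   # [1.0] - both "edge" features fired, so the "object" does
```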
Key AI Tech - Speech recognition and Recurrent Neural Networks
Speech Recognition
Speech recognition and Recurrent Neural Networks (RNNs) • Everything up to now has been a feed-forward network • In RNNs, the output is fed back in as input • The feedback serves as a type of short-term memory! • Takes a sequence of inputs • Supplies results over time
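The feedback loop can be sketched with a single recurrent unit. The weights are illustrative, not a trained model; the point is only that each step's state mixes the new input with the previous state:

```python
# Minimal sketch of recurrence: the previous output h is fed back in
# alongside each new input x, acting as short-term memory.
def rnn_step(x, h, w_in=0.5, w_back=0.5):
    return w_in * x + w_back * h

h = 0.0                        # the memory starts empty
outputs = []
for x in [1.0, 0.0, 0.0]:      # a sequence of inputs over time
    h = rnn_step(x, h)
    outputs.append(h)
print(outputs)   # [0.5, 0.25, 0.125] - the first input echoes through time
```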
Key AI Tech - Ok Google, Siri and Recursive Neural Tensor Networks
Ok Google, Hey Siri
Sentence parsing
Alice drove down the street in her car
Recursive Neural Tensor Networks (RNTNs) [diagram: a root node with left and right leaf nodes] • Good for recognising hierarchical structures • Tree-like structure - a root node with left and right children • Just like RNNs, the complexity is in how they are invoked
Recursive Neural Tensor Networks [diagram build: the parse tree for “The car is fast” is assembled node by node - the leaves “The” and “car” are combined first, each node emitting a class and score, then “is” and “fast” are folded in until the root spans the whole sentence]
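The tree walk can be sketched as a recursion over the parse tree for "The car is fast". The `combine` function here is a crude stand-in (a simple average) for the learned tensor layer, and the 2-dimensional word vectors are made up:

```python
# Leaves are word vectors; internal nodes are (left, right) pairs.
# The same combiner is applied at every node, root last.
def combine(left, right):
    # Placeholder for the neural layer that merges two child vectors
    return [(l + r) / 2 for l, r in zip(left, right)]

def evaluate(node):
    if isinstance(node, list):   # a leaf: already a word vector
        return node
    left, right = node           # an internal node
    return combine(evaluate(left), evaluate(right))

# ((The car) (is fast)) with toy 2-dimensional word vectors
tree = (([1.0, 0.0], [0.0, 1.0]), ([1.0, 1.0], [0.0, 0.0]))
root = evaluate(tree)            # the root vector is what gets classified
```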
Part 2 - TensorFlow
Tensor What?
TensorFlow - some code
$ python
...
>>> import tensorflow as tf
>>> hello = tf.constant('Hello, TensorFlow!')
>>> sess = tf.Session()
>>> print(sess.run(hello))
Hello, TensorFlow!
>>> a = tf.constant(10)
>>> b = tf.constant(32)
>>> print(sess.run(a + b))
42
>>>
TensorFlow - some code
# A deep classifier: four hidden layers of 20 nodes each, two output classes
dnnc = tf.contrib.learn.DNNClassifier(
    feature_columns=feature_columns,
    hidden_units=[20, 20, 20, 20],
    n_classes=2)

# Train on the labelled lat/lng data, then measure accuracy on held-out data
dnnc.fit(x=latlng_train, y=is_mt_train, steps=200)
accuracy = dnnc.evaluate(x=latlng_test, y=is_mt_test)["accuracy"]
print('Accuracy: {:.2%}'.format(accuracy))
Building an app with TensorFlow
Tensor No