Data Analytics and Machine Learning. Cheng Zihan, Hor Jasrene, Joshua Tan. EEE03
Content Outline 1. Introduction 2. Aims & Objectives 3. Methodology 4. Results & Discussion 5. Conclusion
1. Introduction What is Machine Learning?
What is Machine Learning? ▰ An application of AI ▰ Identifies patterns & makes decisions based on past data ▰ Analyses & interprets trends to generate new insights ▰ Handles massive data without human intervention
Applications ▰ Online Fraud Detection ▰ Malware Filtering ▰ Facial Recognition ▰ Virtual Personal Assistants
Classifications of Machine Learning Algorithms. Supervised Learning: ▰ Past data to train the algorithm ▰ Compare predicted with expected output ▰ Model optimisation to reduce error ▰ Classification & Regression. Unsupervised Learning: ▰ Training data is not labelled
2. Aims & Objectives Why did we work on this project?
Aims & Objectives ▰ Train the Perceptron Learning Algorithm & the Adaline Gradient Descent Algorithm via supervised learning ▰ Compare the effectiveness of the two algorithms in classifying data
3. Methodology How did we carry out the project?
Data Sets ▰ Iris data set for analysis by the algorithms ○ 3 classes of 50 instances each ▰ 5 attributes ○ Sepal Length ○ Sepal Width ○ Petal Length ○ Petal Width ○ Type of Iris Plant
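The attribute layout above matches the standard UCI iris.data CSV format (four numeric attributes followed by the class label). A minimal sketch of reading such rows into numeric form that the classifiers can use; the inline sample rows stand in for the actual file, and the label encoding (+1/-1) is an assumption chosen to match the perceptron output:

```python
import csv
import io

# Two sample rows in the iris.data CSV layout, used here in place of the file.
sample = io.StringIO(
    "5.1,3.5,1.4,0.2,Iris-setosa\n"
    "7.0,3.2,4.7,1.4,Iris-versicolor\n"
)

X, y = [], []
for row in csv.reader(sample):
    if not row:
        continue
    X.append([float(v) for v in row[:4]])       # sepal/petal length & width
    y.append(1 if row[4] == "Iris-setosa" else -1)  # numeric class labels

print(X, y)
```

Converting the class names to +1/-1 is what makes the data "recognisable by the algorithms", since both classifiers work on numeric targets.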
Visual representation of the training data set
Perceptron ▰ Updates weights using individual instances of data
Perceptron Learning Rule ▰ Perceptron trained with past data classified into matrices ▰ Aggregates the input based on the weights to generate an output ▰ AIM: Generate accurate perceptron weights for each attribute
Perceptron Learning Rule
(a) z_i = w_0 + Σ_{j=1}^{p} w_j x_{i,j}, with z the intermediate value, w the weight and x the input.
(b) A threshold is applied to z_i to produce the output y'_i:
y'_i = 1 if z_i ≥ 0; -1 otherwise.
Perceptron Learning Rule
(c) Update the weights of the perceptron: calculate the error e_i, the difference between the target class label and the output, and multiply it by the update step size η:
e_i = y_i - y'_i; w_0 := w_0 + η e_i; w := w + η e_i x_i.
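Steps (a)-(c) can be sketched as a short training loop; the function name and the defaults for the step size eta and the number of epochs are illustrative assumptions, not values from the slides:

```python
import numpy as np

def perceptron_fit(X, y, eta=0.1, epochs=10):
    """Perceptron learning rule sketch: per training instance, compute
    z_i = w0 + w.x_i, threshold it to y'_i, and nudge the weights by
    eta * e_i (illustrative parameters, not from the slides)."""
    w = np.zeros(X.shape[1])
    w0 = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w0 + np.dot(w, xi)        # (a) intermediate value z_i
            y_pred = 1 if z >= 0 else -1  # (b) thresholded output y'_i
            e = yi - y_pred               # (c) error e_i = y_i - y'_i
            w0 += eta * e                 # bias update
            w += eta * e * xi             # weight update
    return w0, w
```

Note the update fires instance by instance, which is what makes the perceptron sensitive to individual outliers, a point the results section returns to.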
Adaline ▰ Updates weights using the gradient computed over the whole training data
Similarities ▰ Classifiers ○ Iris data converted to numerical format ○ Recognisable by algorithms ▰ Weights ○ Prediction of data points ○ Continuously updated to minimise error ▰ Decision boundary ○ Classification of data by algorithms
Experimental Procedure: 1. Weight generation and updates → 2. Algorithm class created → 3. Comparing error graphs → 4. Comparing decision boundaries
4. Results & Discussion What did we find out from the project?
Classification graph (classified breeds: setosa and versicolor) ▰ Used in comparison with the classifications of the respective algorithms to determine their accuracy and sensitivities.
Comparison of magnitude of errors produced (blue: Perceptron Algorithm, lower rate of learning; orange: Adaline Algorithm, higher rate of learning)
Classification by respective Algorithms: Perceptron Algorithm, Adaline Algorithm ▰ Perceptron Algorithm is more sensitive to outliers than Adaline Algorithm
Conclusion ▰ The Perceptron Learning Algorithm is more sensitive and more accurate in classifying this data than the Adaline Algorithm. ▰ The results give important indications of the potential of these algorithms for training on and classifying data.
ACKNOWLEDGEMENTS ▰ Supervisor: Prof. Andy W.H. Khong ▰ School teacher mentors: Mr Nicholas Wong, Mr Lim Cher Chuan, Dr Goh Ker Liang
THE END Thank you so much for your time!