2018 BEKRAF Developer Day, Artificial Intelligence Lab, June 21, 2018
Introduction to Machine Learning, Deep Learning & Its Application July 1, 2018
Topics:
• Machine Learning
• History of Machine Learning
• Components of Learning
• Types of Machine Learning
• Deep Learning
• History of Deep Learning
• Traditional Learning vs Deep Learning
• Image Classification Problem
Machine Learning Problem
Machine Learning is the science of getting computers to learn and act the way humans do, and to improve their learning over time in an autonomous fashion, by feeding them data and information in the form of observations and real-world interactions.
Why Is Machine Learning Important? “A.I. will make our lives ‘more productive and creative.’”
Machine Learning Problem
• The essence of Machine Learning:
• A pattern exists
• We cannot pin it down mathematically
• We have data
• Initial problems were hard for humans but easy for computers.
• Current problems are hard for computers but easy for humans.
Source: http://work.caltech.edu
The Components of Learning
• Problem formulation (credit approval)
• Input: x (the customer's application)
• Output: y (approve or deny)
• Target function: f : X → Y (the ideal, unknown credit-approval formula)
• Data: (x₁, y₁), …, (x_N, y₁) — historical records
• Hypothesis: g : X → Y (the formula to be used, chosen so that g ≈ f)
Source: http://work.caltech.edu
The Components of Learning
Unknown target function → Training examples → Learning algorithm (selecting from a hypothesis set) → Final hypothesis
Source: http://work.caltech.edu
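The learning diagram above can be sketched in code. This is a minimal illustration, not the deck's own example: the toy "credit" numbers are made up, the hypothesis set is linear classifiers, and the learning algorithm is the perceptron rule, which picks a final hypothesis g from that set using only the training examples.

```python
import numpy as np

# Toy "credit approval" data: each row is a customer x, label y is +1 (approve) / -1 (deny).
# The target function f is unknown; we only see these training examples (x_n, y_n).
X = np.array([[5.0, 1.0], [4.0, 2.0], [1.0, 4.0], [0.5, 5.0]])  # e.g. [salary, debt]
y = np.array([1, 1, -1, -1])

# Hypothesis set: linear classifiers h(x) = sign(w . x + b).
w, b = np.zeros(2), 0.0

# Learning algorithm: the perceptron rule selects a final hypothesis g.
for _ in range(100):
    for x_n, y_n in zip(X, y):
        if np.sign(w @ x_n + b) != y_n:   # misclassified training example
            w += y_n * x_n                # nudge the boundary toward it
            b += y_n

g = lambda x: int(np.sign(w @ x + b))
print(g(np.array([4.5, 1.5])))  # high salary, low debt -> 1 (approve)
```

The point of the sketch is the separation of roles: the data and the hypothesis set are fixed inputs, and the algorithm's only job is to pick g.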
History of Machine Learning
1642 First mechanical calculator
1679 Modern binary system born
1834 The “father of the computer” ~ punch-card programming
1950 Alan Turing creates the “Turing test”
1957 Frank Rosenblatt proposes the perceptron algorithm
2006 Neural net research gets a reboot as “deep learning”
2015 A computer wins at the world's hardest board game
Types of Machine Learning
Supervised Learning
• Classification: image recognition, churn prediction, anomaly detection
• Regression: market forecasting, weather forecasting, population prediction
Unsupervised Learning
• Clustering: target marketing, customer segmentation, recommender systems
• Dimensionality reduction: data visualization, feature selection, structure discovery
Reinforcement Learning
• Robot movement, skills acquisition, learning tasks, game AI
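The supervised/unsupervised split above comes down to whether the labels are used. As a hedged NumPy-only sketch (the blob data is synthetic, and nearest-class-mean and k-means stand in for the many algorithms in each family):

```python
import numpy as np

# Two blobs of 2-D points. Supervised learning sees the labels y; unsupervised does not.
rng = np.random.default_rng(0)
a = rng.normal([0, 0], 0.3, (20, 2))
b = rng.normal([3, 3], 0.3, (20, 2))
X = np.vstack([a, b])
y = np.array([0] * 20 + [1] * 20)

# Supervised (classification): use the labels to build a nearest-class-mean classifier.
means = np.array([X[y == k].mean(axis=0) for k in (0, 1)])
predict = lambda p: int(np.argmin(np.linalg.norm(means - p, axis=1)))

# Unsupervised (clustering): k-means recovers similar groups without ever seeing y.
centroids = X[[0, -1]].copy()            # crude init: one point from each end
for _ in range(10):
    assign = np.argmin(np.linalg.norm(X[:, None] - centroids, axis=2), axis=1)
    centroids = np.array([X[assign == k].mean(axis=0) for k in (0, 1)])

print(predict([2.9, 3.1]))   # point near the second blob -> 1
```

Both halves end up with two groups here, but only the supervised half can attach meaningful class names to them.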
Learning Puzzle Question :
Machine Learning Tools Languages Frameworks Development Tools Data Platform
How Easy Is Machine Learning (Programming)? Recognizing Digits
• Question:
• How fast can a human recognize digits?
• How many lines of code does it take?
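One possible answer to the lines-of-code question, sketched with scikit-learn (one of the frameworks on the tools slide). This is an illustration, not the speaker's demo; it uses scikit-learn's small built-in 8x8 digit images and a plain logistic-regression classifier.

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)                      # 8x8 grayscale digits, flattened to 64 values
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=2000).fit(X_tr, y_tr)  # learn from examples, no hand-written rules
print(clf.score(X_te, y_te))                             # held-out accuracy
```

Roughly five working lines: the model learns the digit shapes from examples, instead of anyone coding rules for what a "7" looks like.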
Difference between AI, ML, and DL
• AI: general and narrow; not specifically neural networks; can be hard-coded rules.
• ML: the ability of a machine to learn using data instead of hard-coded rules.
• DL: “training” an algorithm so that it can learn on its own; mostly neural networks with deep layers.
What Is Deep Learning?
Neural networks are a family of algorithms built around a model of artificial neurons spread across layers. Deep Learning is generally used to describe particularly complex (many-layered) networks.
History of Deep Learning
1943 Electronic brain
1957 Perceptron algorithm
1960 ADALINE
1969 XOR problem
1986 MLP / back-propagation
1995 SVM
2006 Deep neural network (pretraining)
2012 Breakthrough (ImageNet)
Back-Propagation Algorithm
• 2nd-generation neural networks
• We don't know what the hidden units ought to do, but we can compute how fast the error changes as we change a hidden activity.
• Instead of using desired activities to train the hidden units, use error derivatives w.r.t. hidden activities.
• Each hidden activity can affect many output units and can therefore have many separate effects on the error. These effects must be combined.
• We can compute error derivatives for all the hidden units efficiently at the same time.
• Once we have the error derivatives for the hidden activities, it's easy to get the error derivatives for the weights going into a hidden unit.
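The bullets above can be sketched for one hidden layer. This is a minimal NumPy illustration with made-up shapes and a squared-error loss: all hidden-unit error derivatives are computed at once, the many output effects of each hidden unit are combined by a single matrix product, and the weight derivatives then follow.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 3))          # one input example
t = np.array([[1.0, 0.0]])           # desired output
W1, W2 = rng.normal(size=(3, 4)), rng.normal(size=(4, 2))

# Forward pass.
h = np.tanh(x @ W1)                  # hidden activities
yhat = h @ W2                        # linear output units
E = 0.5 * np.sum((yhat - t) ** 2)    # squared error

# Backward pass: error derivatives, not desired activities.
dE_dy = yhat - t                     # dE/d(output)
dE_dh = dE_dy @ W2.T                 # each hidden unit's many output effects, combined
dE_dW2 = h.T @ dE_dy                 # derivatives for the output weights
dE_dW1 = x.T @ (dE_dh * (1 - h**2))  # chain rule through tanh, for all hidden units at once

# Gradient check: nudging W1[0, 0] should change E at the predicted rate.
eps = 1e-6
W1p = W1.copy(); W1p[0, 0] += eps
Ep = 0.5 * np.sum((np.tanh(x @ W1p) @ W2 - t) ** 2)
print(abs((Ep - E) / eps - dE_dW1[0, 0]) < 1e-4)
```

The finite-difference check at the end is the standard way to confirm a hand-derived gradient is right before trusting it in training.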
Deep Learning Growth
Source: whatsthebigdata.com
Traditional Learning vs Deep Learning
• Traditional: input image → feature engineering → features → classifier → “Girl”
• Deep Learning: input image → deep network (features learned automatically) → “Girl”
Handwritten Digits
• Problem: recognize handwritten digits
• Solutions:
• Rule-based system
• Classical machine learning
• Representation / deep learning
Basic Neural Network (Image Recognition)
An image goes into a neural network (model & architecture) acting as a classifier; the same setup also handles sequence / time-series and regression tasks.
Basic Neural Network (Image Recognition)
Stretch the pixels into a single column, then score each class (ship, cat, dog).
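The slide's "stretch pixels into a single column" step can be sketched as follows. The 2x2 "image", the weights, and the biases are made-up numbers purely for illustration; a real network would learn W and b.

```python
import numpy as np

image = np.array([[0.2, 0.8],
                  [0.5, 0.1]])
x = image.reshape(-1)                      # stretch pixels into a single column (length 4)

classes = ["ship", "cat", "dog"]
W = np.array([[ 0.1, -0.5, 0.3,  0.8],    # one row of weights per class
              [ 1.2,  0.4, 0.0, -0.3],
              [-0.4,  0.7, 0.9,  0.2]])
b = np.array([0.0, 0.2, -0.1])

scores = W @ x + b                         # one score per class
print(classes[int(np.argmax(scores))])
```

Each class score is just a weighted sum of all the pixels plus a bias, which is why the image must first be flattened to match the weight rows.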
Image Classification Problem
• Core task: recognize objects
• Image represented as a 3-D array, or simply a vector
• Datasets:
• ImageNet
• CIFAR-10, CIFAR-100
• etc.
• Supervised approach
Image Classification State of the Art
• 2012: AlexNet, 15.3%
• 2013: ZFNet, 14.8%
• 2014: GoogleNet (1st), 6.6%; VGGNet (2nd), 7.3%
• 2015: ResNet, 3.57%
Image Classification Problem Key Challenges in image classification
Training Deep Neural Networks
• One-time setup: activation functions, data preprocessing, weight initialization, regularization, gradient-descent optimizers
• Training dynamics: babysitting the learning process, parameter updates, hyperparameter optimization
• Evaluation: model ensembles
Heuristics: Mini-Batch SGD Loop
• Sample a batch of data
• Forward-prop it through the network
• Get the loss / in-sample error
• Backprop to calculate the gradient
• Update parameters using the gradient
Source: http://cs231n.github.io/convolutional-networks
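The mini-batch SGD loop above can be sketched on a toy model. This is an illustration with synthetic data and a plain linear model standing in for the network; the five numbered steps map one-to-one onto the slide's loop.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=256)   # synthetic regression targets

w, lr, batch = np.zeros(3), 0.1, 32
for step in range(300):
    idx = rng.choice(len(X), batch, replace=False)  # 1. sample a batch of data
    Xb, yb = X[idx], y[idx]
    pred = Xb @ w                                   # 2. forward-prop through the model
    loss = np.mean((pred - yb) ** 2)                # 3. loss / in-sample error
    grad = 2 * Xb.T @ (pred - yb) / batch           # 4. backprop -> gradient
    w -= lr * grad                                  # 5. update parameters using gradient

print(np.round(w, 2))   # close to the true weights [2, -1, 0.5]
```

Each step uses only a small random batch, which is what makes the loop cheap per iteration and noisy but unbiased as an estimate of the full gradient.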
Does Deep Learning Lack a Theoretical Explanation?
A debate arose between Ali Rahimi of Google, winner of the NIPS Test-of-Time award, who argued in his presentation that “machine learning has become alchemy,” and Yann LeCun, who replied that the alchemy analogy was “insulting” and “wrong,” and that dismissing a field “simply because our current theoretical tools haven't caught up with our practice is dangerous.”
Dr. Yiran Chen, Director of the Duke Center for Evolutionary Intelligence: “What Rahimi means is that we now lack a clear theoretical explanation of deep learning models, and this part needs to be strengthened. LeCun means that lacking a clear explanation does not affect the ability of deep learning to solve problems. Theoretical development can lag behind.”
Neural Network Techniques
Source: https://leonardoaraujosantos.gitbooks.io/artificial-inteligence/content/neural_networks.html
Convolutional Neural Network
• Pooling layer: reduces the spatial dimensions (down-sampling)
• ReLU layer: applies the function element-wise to the input tensor
Source: http://cs231n.github.io/convolutional-networks
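The two layers described above can be sketched on a single small feature map. The 4x4 values are made up; the point is that ReLU is element-wise while 2x2 max pooling with stride 2 halves each spatial dimension.

```python
import numpy as np

x = np.array([[ 1., -2.,  3.,  0.],
              [-1.,  4., -3.,  2.],
              [ 0.,  1.,  2., -1.],
              [ 3., -2.,  1.,  0.]])

# ReLU layer: apply max(0, x) element-wise to the input tensor.
relu = np.maximum(x, 0)

# Pooling layer: 2x2 max pooling with stride 2; the reshape splits the map
# into 2x2 blocks, and the max is taken inside each block.
pooled = relu.reshape(2, 2, 2, 2).max(axis=(1, 3))

print(pooled)   # 2x2 down-sampled output
```

Neither layer has any learned parameters, which is why both are cheap and are typically interleaved with the convolution layers that do the learning.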
Convolutional Neural Network
Source: http://cs231n.github.io/convolutional-networks
Deep Learning Resources
• Fast.ai Deep Learning Course
• https://www.coursera.org/learn/machine-learning (Andrew Ng)