
Artificial Intelligence Lab


Presentation Transcript


  1. 2018 BEKRAF Developer Day Artificial Intelligence Lab June 21, 2018

  2. Introduction to Machine Learning, Deep Learning & Its Application July 1, 2018

  3. Topics:
  • Machine Learning
  • History of Machine Learning
  • Components of Learning
  • Types of Machine Learning
  • Deep Learning
  • History of Deep Learning
  • Traditional Learning vs Deep Learning
  • Image Classification Problem

  4. Machine Learning Problem. Machine Learning is the science of getting computers to learn and act like humans do, and to improve their learning over time in an autonomous fashion, by feeding them data and information in the form of observations and real-world interactions.

  5. Why Machine Learning Is Important: “A.I. will make our lives ‘more productive and creative.’”

  6. Machine Learning Problem
  • The essence of machine learning:
    • A pattern exists.
    • We cannot pin it down mathematically.
    • We have data.
  • Initially, the field tackled problems that are hard for humans but easy for computers.
  • Currently, it tackles problems that are hard for computers but easy for humans.
  Source: http://work.caltech.edu

  7. The Components of Learning
  • Problem formulation (credit approval)
  • Input: x (the customer’s information)
  • Output: y (approve or deny credit)
  • Target function: f : X → Y (the ideal, unknown credit-approval formula)
  • Data: (x₁, y₁), …, (x_N, y_N) (historical records)
  • Hypothesis: g : X → Y (the formula the learning algorithm actually produces)
  Source: http://work.caltech.edu

  8. The Components of Learning
  Diagram: the unknown target function f : X → Y generates the training examples (x₁, y₁), …, (x_N, y_N); the learning algorithm searches the hypothesis set H and outputs a final hypothesis g ≈ f.
  Source: http://work.caltech.edu
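To make the diagram concrete, here is a minimal sketch (not from the deck) of those components on the credit-approval example, using linear classifiers as the hypothesis set and the perceptron as the learning algorithm; the data and feature names are invented for illustration.

```python
# A toy instance of the components of learning (illustrative data,
# not from the deck): linear classifiers as the hypothesis set H,
# the perceptron as the learning algorithm.
import numpy as np

# Data (x_1, y_1), ..., (x_N, y_N): historical customer records.
# Each x is [salary, years in job, current debt]; y = +1 approve, -1 deny.
X = np.array([[60, 5, 10], [20, 1, 30], [45, 8, 5], [30, 2, 40]], dtype=float)
y = np.array([1, -1, 1, -1])

# Hypothesis set H: h(x) = sign(w . x + b). Start from an arbitrary member.
w, b = np.zeros(3), 0.0

# Learning algorithm: the perceptron rule. Repeat until no mistakes
# remain (assumes the toy data is linearly separable, as it is here).
for _ in range(1000):
    mistakes = 0
    for x_n, y_n in zip(X, y):
        if np.sign(w @ x_n + b) != y_n:   # misclassified example
            w += y_n * x_n                # move the boundary toward it
            b += y_n
            mistakes += 1
    if mistakes == 0:
        break

# Final hypothesis g, our stand-in for the unknown target f.
print("g(x) = sign(w . x + b), w =", w, ", b =", b)
```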

  9. History of Machine Learning
  • 1642: First mechanical calculator.
  • 1679: The modern binary system is born.
  • 1834: The “father of the computer” and punch-card programming.
  • 1950: Alan Turing creates the “Turing test.”
  • 1957: Frank Rosenblatt proposes the perceptron algorithm.
  • 2006: Neural-net research gets a reboot as “deep learning.”
  • 2015: A computer wins at the world’s hardest board game.

  10. Types of Machine Learning
  • Supervised Learning
    • Classification: image recognition, churn prediction, anomaly detection
    • Regression: market forecasting, weather forecasting, population prediction
  • Unsupervised Learning
    • Clustering: customer segmentation, target marketing, recommender systems
    • Dimensionality Reduction: feature selection, data visualization, structure discovery
  • Reinforcement Learning: game AI, robot movement, skills acquisition, learning tasks

  11. Learning Puzzle. Question:

  12. Machine Learning Tools: languages, frameworks, development tools, and data platforms.

  13. How Easy Is Machine Learning (Programming)? Recognizing Digits
  • Question:
    • How fast can a human recognize digits?
    • How many lines of code would it take a computer?

  14. How Easy Is Machine Learning (Programming)?
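As a rough answer to the “how many lines” question, here is a hedged sketch of a digit classifier; scikit-learn is an assumption on my part, one of many tools from slide 12 that would do.

```python
# Recognizing digits in a handful of lines, using scikit-learn's
# built-in 8x8 digit images.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

digits = load_digits()                      # ~1800 labeled 8x8 images
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

clf = LogisticRegression(max_iter=1000)     # a simple linear classifier
clf.fit(X_train, y_train)                   # "learn" from examples
print("test accuracy:", clf.score(X_test, y_test))
```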

  15. Difference between AI, ML, and DL
  • Artificial Intelligence: general and narrow; not specifically neural networks; can be hard-coded rules.
  • Machine Learning: the ability of a machine to learn using data instead of hard-coded rules; “training” an algorithm so that it can learn how.
  • Deep Learning: mostly neural networks, with deep layers.

  16. What Is Deep Learning? Neural networks are a class of algorithms built around a model of artificial neurons spread across layers. Deep Learning is generally used to describe particularly complex, many-layered networks.
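To anchor the definition, a minimal sketch of a single artificial neuron; the sigmoid activation and the numbers are illustrative choices. Layering many such units, and many layers of them, gives a (deep) network.

```python
# One artificial neuron: a weighted sum of inputs pushed through a
# nonlinearity (sigmoid here). All numbers are illustrative.
import numpy as np

def neuron(x, w, b):
    return 1.0 / (1.0 + np.exp(-(w @ x + b)))   # sigmoid activation

x = np.array([0.5, -1.0, 2.0])   # activities coming from the previous layer
w = np.array([0.8, 0.2, -0.5])   # this unit's learned weights
print(neuron(x, w, b=0.1))       # this unit's activity, passed to the next layer
```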

  17. History of Deep Learning
  • 1943: Electronic brain.
  • 1957: Perceptron algorithm.
  • 1960: ADALINE.
  • 1969: The XOR problem.
  • 1986: MLP and back-propagation.
  • 1995: SVM.
  • 2006: Deep neural networks (pre-training).
  • 2012: Breakthrough (ImageNet).

  18. Back-Propagation Algorithm
  • 2nd-generation neural networks.
  • We don’t know what the hidden units ought to do, but we can compute how fast the error changes as we change a hidden activity.
  • Instead of using desired activities to train the hidden units, use error derivatives w.r.t. hidden activities.
  • Each hidden activity can affect many output units and can therefore have many separate effects on the error. These effects must be combined.
  • We can compute error derivatives for all the hidden units efficiently at the same time.
  • Once we have the error derivatives for the hidden activities, it’s easy to get the error derivatives for the weights going into a hidden unit (see the sketch below).
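A numpy sketch of the idea, not the original lecture code: the shapes, sigmoid hidden units, and squared-error loss are assumptions made for illustration.

```python
# One backward pass through a tiny 2-layer net: output errors ->
# derivatives w.r.t. hidden activities -> derivatives for the weights.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=4)              # input
W1 = rng.normal(size=(3, 4))        # input -> hidden weights
W2 = rng.normal(size=(2, 3))        # hidden -> output weights
t = np.array([1.0, 0.0])            # desired output

# Forward pass: sigmoid hidden units, linear outputs, squared error.
h = 1.0 / (1.0 + np.exp(-(W1 @ x)))         # hidden activities
y = W2 @ h                                  # outputs
E = 0.5 * np.sum((y - t) ** 2)

# Backward pass.
dE_dy = y - t                       # how the error changes with each output
# A hidden activity affects many outputs; combine those separate effects:
dE_dh = W2.T @ dE_dy                # error derivative w.r.t. each hidden activity
dE_dz = dE_dh * h * (1 - h)         # ...through the sigmoid nonlinearity
# From hidden-activity derivatives to the weights going into each unit:
dE_dW1 = np.outer(dE_dz, x)
dE_dW2 = np.outer(dE_dy, h)
print(dE_dW1.shape, dE_dW2.shape)   # (3, 4) and (2, 3), matching W1 and W2
```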

  19. Deep Learning Growth. Source: whatsthebigdata.com

  20. Traditional Learning vs Deep Learning
  Diagram: the traditional pipeline runs the input image through hand-crafted feature engineering to get features, then a classifier outputs the label “girl”; the deep learning pipeline maps the image to “girl” directly, learning its own features.

  21. Handwritten Digits
  • Problem: recognize handwritten digits.
  • Solutions:
    • Rule-based system
    • Classical machine learning
    • Representation / deep learning

  22. Basic Neural Network (Image Recognition)
  Diagram: an image feeds into a neural network (model & architecture) acting as a classifier; the same setup also applies to sequence / time-series data and regression.

  23. Basic Neural Network (Image Recognition)
  Diagram: stretch the image’s pixels into a single column vector; the network then produces one score per class (ship, cat, dog).
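A minimal numpy sketch of that picture, assuming the linear score function f(x) = Wx + b from the CS231n notes the deck cites; the 32×32×3 image size and random weights are illustrative.

```python
# Stretch an image into a column vector x and score each class with
# f(x) = W x + b. Image size and random weights are illustrative.
import numpy as np

image = np.random.rand(32, 32, 3)     # a 32x32 RGB image as a 3D array
x = image.reshape(-1)                 # stretched into one column: 3072 numbers

classes = ["ship", "cat", "dog"]
W = 0.01 * np.random.randn(3, 3072)   # one row of weights per class
b = np.zeros(3)

scores = W @ x + b                    # one score per class
print(dict(zip(classes, np.round(scores, 3))))
```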

  24. Image Classification Problem
  • Core task: recognize objects.
  • Image representation: a 3D array (height × width × channels), or simply a vector.
  • Datasets: ImageNet; CIFAR-10 and CIFAR-100; etc.
  • Supervised approach.

  25. Image Classification: State of the Art (ImageNet top-5 error)
  • 2012: AlexNet, 15.3%
  • 2013: ZFNet, 14.8%
  • 2014: GoogLeNet (1st), 6.6%; VGGNet (2nd), 7.3%
  • 2015: ResNet, 3.57%

  26. Image Classification Problem: key challenges in image classification.

  27. Training Deep Neural Networks
  • One-time setup: activation functions, data preprocessing, weight initialization, regularization, gradient-descent optimizers.
  • Training dynamics: babysitting the learning process, parameter updates, hyperparameter optimization.
  • Evaluation: model ensembles, heuristics.
  Mini-batch SGD loop (see the sketch below):
  • Sample a batch of data.
  • Forward-prop it through the network.
  • Get the loss / in-sample error.
  • Backprop to calculate the gradient.
  • Update the parameters using the gradient.
  Source: http://cs231n.github.io/convolutional-networks
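A runnable sketch of that loop on a toy linear-regression problem; the model, batch size, and step size are illustrative, but the five commented steps map one-to-one onto the list above.

```python
# The mini-batch SGD loop on a toy linear-regression problem.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))                    # toy training set
w_true = np.array([1.0, -2.0, 0.5, 3.0, 0.0])
y = X @ w_true + 0.1 * rng.normal(size=1000)

w = np.zeros(5)                                   # the parameters to learn
step_size = 0.1

for step in range(500):
    idx = rng.integers(0, len(X), size=32)        # 1. sample a batch of data
    xb, yb = X[idx], y[idx]
    pred = xb @ w                                 # 2. forward-prop through the model
    loss = np.mean((pred - yb) ** 2)              # 3. the loss / in-sample error
    grad = 2 * xb.T @ (pred - yb) / len(xb)       # 4. "backprop": gradient of the loss
    w -= step_size * grad                         # 5. update parameters using gradient

print("learned w:", np.round(w, 2))               # close to w_true
```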

  28. Does Deep Learning Lack a Theoretical Explanation?
  A debate between Ali Rahimi of Google, winner of a Test-of-Time award at the NIPS conference, and Yann LeCun. In his award presentation, Rahimi argued that “machine learning has become alchemy.” LeCun replied that the alchemy analogy was “insulting” and “wrong,” and that criticizing the field simply because our current theoretical tools haven’t caught up with our practice is dangerous.
  Dr. Yiran Chen, director of the Duke Center for Evolutionary Intelligence: “What Rahimi means is that we now lack a clear theoretical explanation of deep learning models and this part needs to be strengthened. LeCun means that lacking a clear explanation does not affect the ability of deep learning to solve problems. Theoretical development can lag behind.”

  29. Neural Network Techniques. Source: https://leonardoaraujosantos.gitbooks.io/artificial-inteligence/content/neural_networks.html

  30. Convolutional Neural Network
  • Pooling layer: reduces the spatial dimensions (down-sampling).
  • ReLU layer: applies the function to every element of the input tensor.
  Source: http://cs231n.github.io/convolutional-networks
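A small numpy sketch of those two layers, assuming 2×2 max pooling and an illustrative 8×8×3 input.

```python
# ReLU and 2x2 max pooling on a small activation tensor.
import numpy as np

def relu(t):
    # Apply max(0, .) to every element; the shape is unchanged.
    return np.maximum(t, 0.0)

def max_pool_2x2(t):
    # Halve the spatial dimensions by keeping the max of each 2x2 block.
    h, w, c = t.shape
    return t[: h // 2 * 2, : w // 2 * 2, :].reshape(
        h // 2, 2, w // 2, 2, c).max(axis=(1, 3))

x = np.random.randn(8, 8, 3)       # an 8x8 activation map with 3 channels
print(relu(x).shape)               # (8, 8, 3): same size, negatives zeroed
print(max_pool_2x2(x).shape)       # (4, 4, 3): spatially down-sampled
```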

  31. Convolutional Neural Network. Source: http://cs231n.github.io/convolutional-networks

  32. ConvNet Everywhere

  33. ConvNet Everywhere

  34. ConvNet Everywhere

  35. How Easy Is Deep Learning (Programming)?
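A hedged sketch of what “easy” looks like in code: a small fully connected network on MNIST with tf.keras. The deck does not prescribe a framework, and the layer sizes and epoch count are illustrative choices.

```python
# A complete digit classifier in a dozen lines with tf.keras.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0   # scale pixels to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),  # stretch pixels into a vector
    tf.keras.layers.Dense(128, activation="relu"),  # one hidden layer
    tf.keras.layers.Dense(10, activation="softmax") # one score per digit
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=3)
print(model.evaluate(x_test, y_test))               # [test loss, test accuracy]
```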

  36. Deep Learning Resources
  • Fast.ai Deep Learning Course
  • https://www.coursera.org/learn/machine-learning (Andrew Ng)

  37. Thank You
