Presentation on "Deep Learning Fundamentals - Architecture and Applications" delivered by Kwadwo Agyapon-Ntra, Entrepreneur in Training, Meltwater Entrepreneurial School of Technology.
3. ● Deep learning is taking over
● AI, ML & DL at a glance
● A quick walkthrough of ML basics
● Perceptron
● Artificial neural networks
● Deep Neural Networks
● A look under the hood
● DNN Acceleration - Why now?
● Architectures & Applications
● Case Study: AlphaGo
Overview
IndabaX Ghana
5. AI, ML & DL in One Slide
AI involves machines that can
perform tasks that are characteristic
of human intelligence
ML is “the ability to learn without
being explicitly programmed”
(...one way of achieving AI)
DL is one of many approaches to
machine learning that relies on
Artificial Neural Networks
6.
Machine Learning at a Glance
Mainly:
● Linear Regression
(Interpolation / Extrapolation)
● Logistic Regression
(Classification)
Data is key
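The two techniques above can be sketched in a few lines of NumPy. The data (hours studied vs. exam score) and the pass mark of 60 are hypothetical, purely illustrative:

```python
import numpy as np

# Hypothetical data: hours studied vs. exam score
hours = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
score = np.array([52.0, 55.0, 61.0, 64.0, 70.0])

# Linear regression: fit score ~ w*hours + b by least squares,
# then interpolate/extrapolate with the fitted line.
w, b = np.polyfit(hours, score, deg=1)
predict = lambda h: w * h + b

# Logistic regression (in spirit): squash a linear model through a
# sigmoid to turn it into a pass/fail probability (pass mark = 60).
sigmoid = lambda z: 1 / (1 + np.exp(-z))
p_pass = sigmoid(predict(6.0) - 60)

print(round(predict(6.0), 1))  # extrapolated score for 6 hours of study
```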
7.
The Gradient Descent Algorithm
The Holy Grail of ML
● Start with arbitrary weights
● Make prediction(s)
● Calculate the loss (how far predictions are from actual values)
● Adjust weights to minimize loss (optimization)
● Iterate until the loss converges
Mathematically:
y = W*x + b
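The five steps above, applied to the slide's model y = W*x + b with a squared-error loss, look like this in NumPy (the toy data, learning rate, and iteration count are illustrative choices):

```python
import numpy as np

# Hypothetical toy data: y = 3x + 2 plus a little noise
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3 * x + 2 + rng.normal(scale=0.1, size=100)

w, b = 0.0, 0.0   # start with arbitrary weights
lr = 0.1          # learning rate

for step in range(500):
    pred = w * x + b              # make predictions
    error = pred - y              # how far off we are
    loss = np.mean(error ** 2)    # squared-error loss
    # adjust weights against the gradient to minimize loss
    w -= lr * np.mean(2 * error * x)
    b -= lr * np.mean(2 * error)

print(w, b)  # converges near the true values 3 and 2
```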
8.
A Simple Complex Problem
The XOR Paradox
The Perceptron
● The foundation of neural networks
(resembles a neuron)
● A condensed representation of a regression
model (sort of)
● Works well on linearly separable problems
● Fails on problems like the one on the right
● No single straight line can separate these values
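A minimal sketch of the classic perceptron learning rule shows both behaviours: it masters AND, which is linearly separable, but can never reach 100% on XOR:

```python
import numpy as np

def train_perceptron(X, y, epochs=50):
    """Classic rule: on a mistake, nudge weights by (target - output) * x."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            out = 1 if xi @ w + b > 0 else 0
            w += (target - out) * xi
            b += (target - out)
    return w, b

def accuracy(X, y, w, b):
    preds = (X @ w + b > 0).astype(int)
    return float(np.mean(preds == y))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y_and = np.array([0, 0, 0, 1])   # linearly separable
y_xor = np.array([0, 1, 1, 0])   # not linearly separable

print(accuracy(X, y_and, *train_perceptron(X, y_and)))  # 1.0
print(accuracy(X, y_xor, *train_perceptron(X, y_xor)))  # stuck below 1.0
```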
9.
Artificial Neural Networks
A feedforward network with a single
layer is sufficient to represent any
function, but the layer may be
infeasibly large and may fail to learn
and generalize correctly.
— Ian Goodfellow, Director of ML, Apple Inc.
Essentially, any problem that can be
modelled as a function can in theory be
approximated by a neural network with a
single hidden layer.
The Universal Approximation Theorem
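One hidden layer is indeed enough for the XOR problem from the previous slide. This tiny network uses hand-picked weights (not learned) just to show such a solution exists:

```python
def step(z):
    """Threshold activation: fire if the weighted sum is positive."""
    return 1 if z > 0 else 0

def xor_net(x1, x2):
    h1 = step(x1 + x2 - 0.5)        # hidden unit: at least one input on
    h2 = step(x1 + x2 - 1.5)        # hidden unit: both inputs on
    return step(h1 - 2 * h2 - 0.5)  # output: "at least one, but not both"

print([xor_net(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 1, 1, 0]
```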
10.
Deep Neural Networks
“Go deep or go home”
● Higher dimensionality
● Features extracted automagically
○ End-to-end learning
○ Transfer learning
11.
Deep Neural Networks
Under the Hood
● Forward propagation
○ Pretty straightforward
● Back propagation
○ Chain rule of calculus
● Activation functions
○ Sigmoid
○ Tanh
○ ReLU
● Regularization techniques
○ Dropout
○ Batch, weight, layer normalization
○ Mixup (Moustapha recommends we try this)
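The three activation functions above, together with the local derivatives that back propagation's chain rule multiplies together, take only a few lines of NumPy:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))   # squashes to (0, 1)

def tanh(z):
    return np.tanh(z)                 # squashes to (-1, 1), zero-centered

def relu(z):
    return np.maximum(0.0, z)         # identity for z > 0, else 0

# Back propagation multiplies these local derivatives along the chain rule:
def sigmoid_grad(z):
    s = sigmoid(z)
    return s * (1.0 - s)

def tanh_grad(z):
    return 1.0 - np.tanh(z) ** 2

def relu_grad(z):
    return (z > 0).astype(float)

z = np.array([-2.0, 0.0, 2.0])
print(relu(z))       # [0. 0. 2.]
print(sigmoid(0.0))  # 0.5
```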
13. A modern smartphone has
more computing power than
all of NASA had during the
moon landings.
Graphics Processing Units
(GPUs) have revolutionized
parallel processing.
Cloud computing has made
the power of many powerful
computers available to
regular people.
Acceleration of Deep Neural Networks
2. GPUs, TPUs & Extra Computational Power
14. ● 2.5 quintillion (10^18) bytes of data produced daily
● 90% of the world’s data was generated in the last 2 years
● Every second:
○ 40,000 Google searches
● Every minute:
○ 527,760 photos shared on Snapchat
○ 4,146,600 videos watched on YouTube
○ 456,000 tweets sent
○ 293,000 Facebook status updates
○ 16 million text messages sent
○ 990,000 Tinder swipes
○ 156 million emails sent
● Projected 200 billion IoT devices by 2020
Stats: Forbes Article
- Bernard Marr
Acceleration of Deep Neural Networks
3. Explosion of Data through the Internet
18.
A Few Legendary Models
Honourable mentions:
● YOLOv3 - Real-time object detection
● Wide & Deep - Mix of memorization & generalization (by Google)
Name: Number of Layers
LeNet: 5
AlexNet: 8
VGG: 11-19
GoogLeNet: 22
Inception v4: 76
ResNet: 34-152 (experiments up to 1001 and 1202)
19. AlphaGo (2016)
Google DeepMind's program beats
18-time Go world champion Lee Sedol.
4 : 1
More on AlphaGo
Case Study
20. AlphaGo Zero (2017)
Google DeepMind releases new
version of AlphaGo: AlphaGo Zero.
AlphaGo Zero learns from scratch and
destroys its predecessor, AlphaGo.
100 : 0
More on AlphaGo Zero
Case Study
Computer engineer taking a deep dive into AI, ML and DL.
Recently accepted into
Spare time (non-coding time): I read, watch TED talks, write on my blog, and play my guitar.