November 17, 2017
Catherine Diep, Peter Tan
Running An AI Workload
With IBM PowerAI
Agenda
Deep learning overview
IBM PowerAI and deep learning
TensorFlow frameworks
Image Classification example with the MNIST Dataset
Running the workload on IBM PowerAI
2
Deep Learning basic operations
• Map x -> y
• A neural net is a graph
• Data flows left to right
• The input(s) of the current cell are the output(s) from previous cells
• Tweak all weights until the output matches the expected outcome
3
Deep learning consists of algorithms that permit software to train itself by exposing multilayered neural networks to vast amounts of data.
Computation at a node
4
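The figure for this slide did not survive extraction; the usual formulation is a weighted sum of the inputs plus a bias, passed through an activation function. A minimal sketch (the sigmoid activation and the numbers are assumptions, not from the slide):

import numpy as np

def node(inputs, weights, bias):
    """Weighted sum of the inputs plus a bias, passed through an activation."""
    z = np.dot(weights, inputs) + bias
    return 1.0 / (1.0 + np.exp(-z))   # sigmoid activation (one common choice)

# Example: two inputs feeding one node.
print(node(np.array([0.5, 0.2]), np.array([0.4, -0.3]), 0.1))   # ~0.56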
Training Flow
• Continuously feed input data to the model, comparing output predictions against the actual labels to compute a loss.
• An optimization algorithm is used to minimize this loss by tweaking the weights and biases of the model.
• The model progressively improves and its predictions become more accurate as more data is fed in.
(Diagram: Input data -> Compute (run input through model) -> Output data (predictions) -> Compute loss (compare output to label) -> Adjust weights and biases (minimize loss) -> repeat. Example loss functions: Cross Entropy, Mean Squared Error, ... Example optimizers: Gradient Descent, Adam, ... A toy sketch of this loop follows below.)
5
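A toy sketch of this loop outside TensorFlow, using a one-weight model, mean squared error, and plain gradient descent (the data and learning rate are made up for illustration; TensorFlow versions of each piece appear on later slides):

import numpy as np

# Toy data: y = 3x plus noise; the model is y_hat = w * x and the loss is MSE.
xs = np.linspace(0, 1, 50)
ys = 3.0 * xs + 0.05 * np.random.randn(50)

w, lr = 0.0, 0.1                        # single weight, learning rate
for step in range(200):
    preds = w * xs                      # compute: run input through the model
    loss = np.mean((preds - ys) ** 2)   # compute loss: compare output to labels
    grad = np.mean(2 * (preds - ys) * xs)
    w -= lr * grad                      # adjust weights: minimize loss (gradient descent)

print(round(w, 2))                      # close to 3.0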
PowerAI
PowerAI is an enterprise software distribution of popular open-source deep learning frameworks:
• Enterprise-ready software distribution built on open source
• Performance: faster training times
• Tools for ease of development
• More information can be found at the IBM Marketplace PowerAI Portal
Visit IBM developerWorks for more learning materials and demonstrations
6
TensorFlow From Google
• TensorFlow is an open source software library for numerical computation using data flow graphs
• A framework for deep learning models
• Open sourced November 2015
• Re-designed for research and production
• Rapid adoption
  • 5,500 GitHub repos with TensorFlow in the title
  • Taught in university classes (Stanford, Berkeley, Toronto, ...)
• Fast-growing community
  • 12K+ Q&As on Stack Overflow
• IBM support:
  • IBM PowerAI, Power System S822LC, record speed announced
  • IBM Data Science Experience
  • Watson Machine Learning Platform
7
TensorFlow Graph in Python
1. Construct graph: use tf objects
• Node: operation
• Edge: data (tensor)
2. Execute graph: connect to runtime
• Initialize variables
• Load data, feed through graph
• Train model: compute parameters, loss
• Save checkpoints
• Distribute workload
8
(Diagram: training data flows through the compute/loss graph in the runtime, which produces predictions. A toy example of both phases follows below.)
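A toy example of the two phases, using the same TensorFlow 1.x API as the rest of the slides (the constants are arbitrary):

import tensorflow as tf

# 1. Construct the graph: nodes are operations, edges carry tensors.
a = tf.constant(3.0)
b = tf.constant(4.0)
total = a + b                     # builds a tf.add node; nothing is computed yet

# 2. Execute the graph: connect to a runtime with a session.
with tf.Session() as sess:
    print(sess.run(total))        # 7.0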
MNIST Dataset
9
The MNIST (Modified National Institute of Standards and Technology) dataset consists of 60,000 training images (plus 10,000 test images) of handwritten digits.
Each image has an associated label denoting which digit it is.
The four sample images shown on the slide would have labels 5, 0, 4, and 1.
Problem Description: Image Classification
10
We want to train a deep learning model on the MNIST dataset that can look at an image and predict which digit it shows.
Computer Vision
11
How machines view images:
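The slide's figure did not survive extraction; the point is that a grayscale digit is simply a 28x28 grid of pixel intensities, which is flattened into a 784-element vector before being fed to the model. A small sketch with a random stand-in image:

import numpy as np

# A stand-in for a 28x28 grayscale digit: each entry is a pixel intensity in [0, 1].
image = np.random.rand(28, 28).astype(np.float32)

flat = image.reshape(784)        # the flattened form used on the following slides
print(image.shape, flat.shape)   # (28, 28) (784,)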
Tensors for Regression
12
• Placeholder:
  • 28x28 image flattened into a vector: [784]
• Variables:
  • Weight is a 2D array: [784, 10]
  • Bias is a vector: [10]
• Prediction is simply:
  • y = x * weight + bias
• The optimizer adjusts weight and bias to minimize the loss (error)
  • so that the predicted y is close to the label
(Diagram: X [n x 784] -> MatMul with W [784 x 10] -> Add b [10] -> Softmax -> Y [n x 10]. Deploy with a session to run on CPU or GPU. A NumPy shape check follows below.)
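The shapes above can be verified with a small NumPy sketch; the batch size n = 32 is an arbitrary assumption, not from the slide:

import numpy as np

n = 32                                        # batch size (illustrative)
x = np.zeros((n, 784), dtype=np.float32)      # flattened 28x28 images
W = np.zeros((784, 10), dtype=np.float32)     # weight matrix
b = np.zeros(10, dtype=np.float32)            # bias vector

y = np.dot(x, W) + b                          # (n, 784) x (784, 10) + (10,) -> (n, 10)
print(y.shape)                                # (32, 10): one score per digit class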
What is Softmax?
13
• Normalized exponential function
• A function well suited to assigning probabilities to an object being one of several classes
• A softmax regression has two steps:
  • Add up the evidence of our input being in certain classes
  • Convert that evidence into probabilities
• The sum of all outputs will be equal to 1.0 (checked in the sketch below)
• softmax(y_i) = e^(y_i) / Σ_{j=1..n} e^(y_j)
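A quick NumPy check of the formula (not from the slides): exponentiate each score, divide by the sum, and the outputs add to 1.0. Subtracting the maximum before exponentiating is a standard numerical-stability tweak and does not change the result.

import numpy as np

def softmax(scores):
    e = np.exp(scores - np.max(scores))   # subtract the max for numerical stability
    return e / e.sum()

probs = softmax(np.array([2.0, 1.0, 0.1]))
print(probs)          # approximately [0.659 0.242 0.099]
print(probs.sum())    # 1.0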
Implement Read data
14
from tensorflow.examples.tutorials.mnist import input_data
import tensorflow as tf

def main():
    mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)

if __name__ == '__main__':
    main()
• input_data is a utility module provided by TensorFlow to retrieve the MNIST dataset
• one_hot refers to how the labels will be represented: as one-hot vectors
• A one-hot vector is a vector that is 0 in most dimensions and 1 in a single dimension
• E.g. 3 = [0, 0, 0, 1, 0, 0, 0, 0, 0, 0]
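As a quick sanity check (not on the slide), the returned object exposes the images and one-hot labels as NumPy arrays; with the loader's default split of 55,000 training / 5,000 validation / 10,000 test images, the shapes look like this:

from tensorflow.examples.tutorials.mnist import input_data

mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)
print(mnist.train.images.shape)   # (55000, 784): flattened 28x28 images
print(mnist.train.labels.shape)   # (55000, 10): one-hot labels
print(mnist.test.images.shape)    # (10000, 784)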
Implement Placeholders
15
• Placeholders are inputs
• x is a 2D array for the images:
  • Each row is one flattened 28x28 image
  • The first dimension is "None", to be used to pull in a batch of images at a time (more later)
• y_ is a 2D array for the labels:
  • The second dimension is 10 for the one-hot representation
# Placeholder that will be fed image data.
x = tf.placeholder(tf.float32, [None, 784])
# Placeholder that will be fed the correct labels.
y_ = tf.placeholder(tf.float32, [None, 10])
Implement Weight and Bias
16
• Weight and bias are variables: to be tweaked during training
  • Weight is a 2D array: 784 x 10
  • Bias is a vector: 10
• Initialized with specific values: the initialization matters for the optimization algorithm
def weight_variable(shape):
    """Generates a weight variable of a given shape."""
    initial = tf.truncated_normal(shape, stddev=0.1)
    return tf.Variable(initial)

def bias_variable(shape):
    """Generates a bias variable of a given shape."""
    initial = tf.constant(0.1, shape=shape)
    return tf.Variable(initial)

# Define weight and bias.
W = weight_variable([784, 10])
b = bias_variable([10])
Implement Regression and Loss Optimizer
17
• Neural network: regression + softmax
• Loss function: how far off the prediction is from the label
  • cross_entropy = -(1/N) Σ_{i=1..N} y_(i) * log(y(i)), where y_ = label, y = prediction
• Optimizer algorithm: how to tweak the variables
# Here we define our model which utilizes the softmax regression.
y = tf.nn.softmax(tf.matmul(x, W) + b)

# Define our loss.
cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(y), reduction_indices=[1]))

# Define our optimizer.
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)
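A side note not on the slide: computing the softmax and the log separately like this can be numerically unstable. A hedged alternative sketch, assuming the x, W, b, and y_ tensors defined on the earlier slides, uses TensorFlow's combined op:

# Keep the un-normalized logits and let TensorFlow combine softmax and cross entropy.
logits = tf.matmul(x, W) + b
cross_entropy = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=logits))
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)

With this formulation, tf.nn.softmax is only needed when probabilities are required at prediction time.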
Create Session connecting to Runtime
18
• Create a session
• You can connect to a runtime on a remote cluster for large-scale training
  • Distributed TensorFlow
• Different types of session:
  • Normal session to run full training (sketched below)
  • Interactive session for modifying the neural network on the fly
# Launch session.
sess = tf.InteractiveSession()
# Initialize variables.
tf.global_variables_initializer().run()
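For the "normal session" case mentioned above, a minimal sketch (not from the slides) would scope the session in a context manager instead:

# Regular session: the graph is run inside a context manager.
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # ... training and evaluation calls to sess.run(...) go here ...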
Train and Evaluate Model
19
• Here, we run our training step 1100 times, feeding in batches of data to replace the placeholders
• The batches are random data points we retrieve from our image training set
• We then check the model against the test data to get our accuracy (the accuracy tensor is defined in the sketch after the code)
# Do the training.
for i in range(1100):
    batch = mnist.train.next_batch(1)
    sess.run(train_step, feed_dict={x: batch[0], y_: batch[1]})

# See how model did.
print("Test Accuracy %g" % sess.run(accuracy, feed_dict={
    x: mnist.test.images, y_: mnist.test.labels}))
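One gap worth noting: the accuracy tensor used above is not defined on this slide. A minimal sketch of the standard definition, assuming the y (prediction) and y_ (label) tensors from the earlier slides:

# A prediction is correct when the most likely digit matches the label.
correct_prediction = tf.equal(tf.argmax(y, 1), tf.argmax(y_, 1))
# Accuracy is the fraction of correct predictions over the evaluated examples.
accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))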
Demo on an IBM PowerAI Trial Server
• IBM has partnered with Nimbix to provide cognitive developers with a trial account that gives 24 hours of free processing time on the PowerAI platform
• Go to the IBM Marketplace PowerAI Portal
• Click the "Request trial" button
• Follow the instructions provided to register and access your IBM PowerAI trial environment
• The demo on the trial server can be found at http://localhost:8888/tree/demo
20
21
1. Log on to Ubuntu as user "nimbix" with the password provided by Nimbix
Bring up a terminal window and ssh to the Nimbix server (e.g. NAE-165-254-189-20.jarvice.com) as the nimbix user with the provided password.
=> ssh nimbix@NAE-165-254-189-20.jarvice.com
22
2. Download and Install Miniconda
# Download Miniconda
=> cd
=> wget -c https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-ppc64le.sh
# Install Miniconda
=> cd
=> chmod 744 Miniconda3-latest-Linux-ppc64le.sh
=> ./Miniconda3-latest-Linux-ppc64le.sh
## then follow the on-screen instructions to finish the install.
## Answer "yes" when asked to add the install location to .bashrc
# Log off the Nimbix server and log back in, or run the following command
=> source ~/.bashrc
23
3. Create a Miniconda environment with Python 2.7
=> conda create -n image_cls python=2.7
4. Activate the Conda environment
=> source activate image_cls
5. Activate the NVIDIA libraries
=> export PATH="/usr/lib/nvidia-361/bin:/usr/local/cuda-8.0/bin:$PATH"
=> export CUDA_HOME=/usr/local/cuda-8.0
=> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/cuda-8.0/lib64
24
6. Prepare TensorFlow
# Install numpy (Only need to run this step once)
=> pip install numpy
# Activate tensorflow
=> source /opt/DL/tensorflow/bin/tensorflow-activate
# Check to see if tensorflow is ready
=> pip list |grep tensorflow
(image_cls) nimbix@JARVICENAE-0A0A1847:~$ pip list |grep tensorflow
DEPRECATION: The default format will switch to columns in the future. You can
use --format=(legacy|columns) (or define a format=(legacy|columns) in your
pip.conf under the [list] section) to disable this warning.
tensorflow (1.1.0)
(image_cls) nimbix@JARVICENAE-0A0A1847:~$
25
7. Install the image classification workload
=> cd
# Download the digital image classification workload.
=> git clone https://github.com/pvaneck/tf_mnist
8. Training.
=> cd tf_mnist
=> python ./train_basic_model.py ## the training result will be saved in the ~/tf_mnist/saved-model directory
26
9. Prediction
# Predict the class of an image using the model saved in the ~/tf_mnist/saved-model directory, with a sample image from the ~/tf_mnist/sample-images directory
=> python ./classify_mnist.py sample-images/img_1.jpg
# This is what the img_1.jpg image looks like (shown on the slide)
# This is the program output:
2 (confidence = 0.99987)
3 (confidence = 0.00010)
0 (confidence = 0.00003)
8 (confidence = 0.00000)
5 (confidence = 0.00000)
The result shows that the class with the highest confidence is "2".
27