DevOps and Machine Learning workflows with Azure
1. DevOps and Machine Learning
@vishwasnarayan5 Vishwas N
https://hacksterdude.web.app/
2. About me
I am a podcaster: http://tiny.cc/vnrpodcast
I love talking to techies
A bibliophile
Passionate about image datasets – computer vision
Now exploring the Azure cloud
14. Machine Learning on Azure
Sophisticated pretrained models to simplify solution development (Vision, Language, Azure Search)
Popular frameworks to build advanced deep learning solutions (TensorFlow, Keras, PyTorch, ONNX)
Productive services to empower data science and development teams (Azure Databricks, Azure Machine Learning, VMs)
Powerful infrastructure to accelerate deep learning
Flexible deployment to deploy and manage models on the intelligent cloud and edge (on-premises, cloud, edge)
16. The data science process
Ask great questions → Collect the data → Prepare the data → Select the algorithm → Train the model → Deploy in the real world
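The steps above can be sketched end to end in plain Python. This is an illustrative, framework-free example (the data and the least-squares fit are hypothetical stand-ins, not anything Azure-specific):

```python
# Minimal sketch of the data science process:
# collect -> prepare -> train (least-squares fit) -> deploy as a function.

def collect_data():
    # Hypothetical raw observations: hours studied vs. exam score.
    return [(1, 52), (2, 55), (3, 61), (4, 64), (5, 70), (None, 99)]

def prepare_data(raw):
    # Drop incomplete rows.
    return [(x, y) for x, y in raw if x is not None and y is not None]

def train_model(data):
    # Ordinary least squares for y = a*x + b.
    n = len(data)
    sx = sum(x for x, _ in data)
    sy = sum(y for _, y in data)
    sxx = sum(x * x for x, _ in data)
    sxy = sum(x * y for x, y in data)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def deploy(model):
    # "Deploy in the real world": expose the model as a scoring function.
    a, b = model
    return lambda x: a * x + b

model = train_model(prepare_data(collect_data()))
predict = deploy(model)
```

Each Azure ML service mentioned later maps onto one of these stages: datastores for collection, pipelines for preparation and training, endpoints for deployment.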
17. Azure Machine Learning
A fully managed cloud service that enables you to easily build, deploy, and share predictive analytics solutions.
18. What is Azure Machine Learning?
A set of Azure cloud services, accessed through the Python SDK, R, and more, that enables you to:
✔ Prepare data
✔ Build models
✔ Train models
✔ Manage models
✔ Track experiments
✔ Deploy models
22. Azure Machine Learning
Datasets – registered, known data sets
Experiments – training runs
Pipelines – training workflows
Models – registered, versioned models
Endpoints:
  Real-time endpoints – deployed model endpoints
  Pipeline endpoints – training workflow endpoints
Compute – managed compute
Environments – defined training and inference environments
Datastores – connections to data
26. Azure Machine Learning Pipelines
Workflows of steps that can use data sources, datasets, and compute targets
Unattended runs
Reusability
Tracking and versioning
27. Azure Pipelines
Orchestration for continuous integration and continuous delivery
Gates, tasks, and processes for quality
Integration with other services
Trigger on code and non-code events
28. Create a pipeline
Each pipeline step runs a script on a compute target in a Docker container, with defined inputs, outputs, and parameters.
29. Create a pipeline
Dataset → Prepare data → processed dataset → Train the model with PyTorch → model → Register the model
The dataset is read from a Blob Storage account; the registered model lands in Model Management.
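The flow above can be sketched framework-agnostically as three chained steps ending in a registry. The function names and the dict-based registry are illustrative stand-ins, not the Azure ML API; in Azure ML, registration would go to Model Management:

```python
# Illustrative sketch of the pipeline: prepare -> train -> register.

MODEL_REGISTRY = {}  # stand-in for Model Management

def prepare_data(dataset):
    # Normalize values to [0, 1] (stand-in for real data preparation).
    hi = max(dataset)
    return [x / hi for x in dataset]

def train_model(processed):
    # Stand-in for PyTorch training: "learn" the mean of the data.
    return {"mean": sum(processed) / len(processed)}

def register_model(name, model):
    # Version the model on registration, as a model registry would.
    version = len([k for k in MODEL_REGISTRY if k[0] == name]) + 1
    MODEL_REGISTRY[(name, version)] = model
    return version

dataset = [2.0, 4.0, 6.0, 8.0]          # e.g. read from Blob Storage
processed = prepare_data(dataset)        # step 1: prepare data
model = train_model(processed)           # step 2: train the model
version = register_model("demo", model)  # step 3: register the model
```

Because each step only consumes the previous step's output, steps can be reused and re-run independently — the property the earlier slide calls reusability.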
32. Jupyter Notebook
Submitting an experiment from a notebook to a compute target:
1. Snapshot the folder and send it to the experiment
2. Create the Docker image
3. Deploy the Docker image and snapshot to the compute
4. Mount the datastore to the compute
5. Launch the script
6. Stream stdout, logs, and metrics
7. Copy over the outputs
36. Source control
Code and comments only (not Jupyter output)
Plus every part of the pipeline
And infrastructure and dependencies
And maybe a subset of data
37. Everything should be in source control!
Except your training data, which should be a known, shared data source
38. Continuous Integration
Triggered on code change
Refresh and execute the AML pipeline
Code quality, linting, and unit testing
Pull request process
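A CI flow like this could be expressed as an Azure Pipelines YAML definition along these lines (a sketch: the script paths, tool choices like flake8/pytest, and the `run_aml_pipeline.py` helper are placeholders, not part of the original deck):

```yaml
# azure-pipelines.yml (sketch; paths and script names are placeholders)
trigger:
  branches:
    include:
      - main

pool:
  vmImage: ubuntu-latest

steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: '3.9'

  - script: pip install -r requirements.txt
    displayName: Install dependencies

  - script: flake8 src/              # code quality and linting
    displayName: Lint

  - script: pytest tests/            # unit tests
    displayName: Unit tests

  - script: python scripts/run_aml_pipeline.py   # refresh and execute the AML pipeline
    displayName: Trigger AML pipeline
```

Pull-request validation uses the same definition: a branch policy runs the pipeline before a merge is allowed.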
41. Continuous Delivery
Trigger on model registration
Deploy to test and staging environments
Run integration and load tests
Control rollout: feature flags, A/B testing
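The CD side can be sketched as multi-stage Azure Pipelines YAML with deployment jobs (again a sketch: the stage and environment names and the `deploy_model.py`/`load_test.py` scripts are hypothetical; wiring the trigger to model registration, e.g. via an Azure ML event subscription, is left to the orchestrator):

```yaml
# Sketch of a CD flow: deploy to test, then staging, with gates between.
stages:
  - stage: Test
    jobs:
      - deployment: DeployToTest
        environment: test
        strategy:
          runOnce:
            deploy:
              steps:
                - script: python scripts/deploy_model.py --target test
                  displayName: Deploy model to test
                - script: pytest tests/integration/
                  displayName: Integration tests

  - stage: Staging
    dependsOn: Test
    jobs:
      - deployment: DeployToStaging
        environment: staging
        strategy:
          runOnce:
            deploy:
              steps:
                - script: python scripts/deploy_model.py --target staging
                  displayName: Deploy model to staging
                - script: python scripts/load_test.py
                  displayName: Load tests
```

Approval gates on the `staging` environment, plus feature flags or A/B routing at the endpoint, give the controlled rollout the slide describes.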