Who are we?
Dhaval Jagani
Principal Technology Architect
Leads the open source database
practice within US for the
Modernization COE
Rajib Deb
Associate Vice President
Head of Architecture for
the Modernization COE
MongoDB…
Powering the new age data demands
Where are we seeing the usage of new age databases?
…all while keeping in view the volume, velocity, and variety of data
• Building highly available applications
• Enabling event-driven architectures
• Implementing distributed query capability
MongoDB has been solving many of these new age demands
Let us take a peek into some of these real world use cases
Client Scenario # 1
Who is the client?
The client is an American multinational technology conglomerate. They embarked on a project to
automate one of their business processes to enhance customer experience and reduce customer churn.
The existing process is manual and touches multiple systems and applications across various lines of
business. A project was initiated to design a target architecture that would completely automate the
process and reduce the elapsed time from months to minutes.
What were we trying to solve?
The objective of this project was to design and implement a solution that would seamlessly integrate with
heterogeneous applications, thereby reducing or eliminating manual touch points. The target architecture
followed an event-driven design with Kafka as the integration broker. A mechanism was also required to
provide a real-time view of a transaction's progress as it moves from one stage to the next.
How is MongoDB leveraged in this use case?
A transaction is complete when it has gone through all the stages shown here. It was important to have
visibility into the process, and there was also a need to recreate the entire transaction in case of a failure.
Each time a stage event occurs, the event is also pushed to MongoDB, which triggers a refresh of the
Angular dashboard. To persist the transaction state, a collection was created with the following data model:
{
  "transactionId": "12345",
  "currentState": "Quote Creation",
  "noOfCompletedStage": 3,
  "completedStages": [
    { "stageName": "Client Onboarding",
      "stageStartTime": "19:08:00",
      "stageEndTime": "19:10:00" },
    { "stageName": "Opportunity Creation",
      "stageStartTime": "19:11:00",
      "stageEndTime": "19:12:00" },
    { "stageName": "Quote Creation",
      "stageStartTime": "19:14:00",
      "stageEndTime": "19:15:00" }
  ]
}
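The state-update step that maintains this document can be sketched in Python. This is an illustrative sketch, not the client's actual code: the `complete_stage` helper and the sample values are assumptions, and in production the update would be an atomic `$push`/`$inc` against MongoDB rather than an in-memory rewrite.

```python
from copy import deepcopy

def complete_stage(txn_doc, stage_name, start_time, end_time):
    """Return a new transaction document with one more completed stage.

    Mirrors the collection's data model: the finished stage is appended
    to completedStages, the counter is bumped, and currentState advances.
    """
    doc = deepcopy(txn_doc)
    doc["completedStages"].append({
        "stageName": stage_name,
        "stageStartTime": start_time,
        "stageEndTime": end_time,
    })
    doc["noOfCompletedStage"] = len(doc["completedStages"])
    doc["currentState"] = stage_name
    return doc

txn = {
    "transactionId": "12345",
    "currentState": "Opportunity Creation",
    "noOfCompletedStage": 2,
    "completedStages": [
        {"stageName": "Client Onboarding",
         "stageStartTime": "19:08:00", "stageEndTime": "19:10:00"},
        {"stageName": "Opportunity Creation",
         "stageStartTime": "19:11:00", "stageEndTime": "19:12:00"},
    ],
}
txn = complete_stage(txn, "Quote Creation", "19:14:00", "19:15:00")
```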
Architecture
What were the salient features of MongoDB used in the use case?
Schema-free design
• The framework needs to support multiple business processes
• Each of them may have a different payload to process
Aggregation framework
• Enables useful insights and optimization recommendations
• Ability to aggregate the transactions by multiple dimensions
Change Streams
• Enables auto-refresh of the transaction monitoring dashboard
• Any change or modification to a transaction is pushed to the dashboard
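As an illustration of the aggregation-framework point, a pipeline that counts transactions by current stage might look like the following. The pipeline and collection/field names follow the sample document above and are a sketch, not the project's actual pipeline; the pure-Python function shows concretely what the `$group` stage computes.

```python
# Aggregation pipeline as it would be passed to collection.aggregate(...):
# group transactions by currentState and count them, busiest stage first.
pipeline = [
    {"$group": {"_id": "$currentState", "count": {"$sum": 1}}},
    {"$sort": {"count": -1}},
]

def group_by_state(docs):
    """Pure-Python equivalent of the $group stage above."""
    counts = {}
    for d in docs:
        counts[d["currentState"]] = counts.get(d["currentState"], 0) + 1
    return counts

sample = [
    {"currentState": "Quote Creation"},
    {"currentState": "Quote Creation"},
    {"currentState": "Client Onboarding"},
]
counts = group_by_state(sample)
```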
Client Scenario # 2
Who is the client?
The client is an American multinational technology conglomerate. Their current return merchandise process
is manual and error prone. If a customer returns a defective item, it takes manual effort and time to
determine whether the item should be fixed or a new item needs to be shipped.
What were we trying to solve?
The objective of this project was to design a framework that would orchestrate the entire return merchandise
process. The orchestration metadata must be abstracted from the process so that any change in the process
can be supported with minimal changes. MongoDB was chosen as the orchestration metadata store.
How is MongoDB leveraged in this use case?
{
  "eventName": "XXXX",
  "eventType": "Action",
  "eventAPI": {
    "endpoint": "xxxx",
    "oauthpurl": "xxxx",
    "payload": "XXXX"
  }
}
{
  "eventName": "XXXX",
  "sourceTopic": "XXXX",
  "targetTopic": "XXXX"
}
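These two metadata documents drive the orchestration: given an event name, the framework resolves the source and target topics from the cached metadata and routes accordingly. A minimal Python sketch of that lookup (the topic names and the `route` helper are hypothetical, and in production the cache would be loaded from the MongoDB metadata store):

```python
# In-memory stand-in for the metadata cache, keyed by event name.
metadata_cache = {
    "returnReceived": {"sourceTopic": "rma.in", "targetTopic": "rma.triage"},
    "repairApproved": {"sourceTopic": "rma.triage", "targetTopic": "rma.repair"},
}

def route(event_name):
    """Resolve (sourceTopic, targetTopic) for an event from cached metadata."""
    meta = metadata_cache.get(event_name)
    if meta is None:
        raise KeyError(f"no orchestration metadata for event {event_name!r}")
    return meta["sourceTopic"], meta["targetTopic"]

src, dst = route("returnReceived")
```

Because the routing lives in data rather than code, a process change is a metadata update, not a redeployment.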
Architecture: for each event (Event 1 … Event n), the flow is source topic lookup → processing layer →
sink topic, with all lookups served from a shared metadata cache.
Client Scenario # 3
Who is the client?
The client is an American healthcare retail giant. They embarked on a project to improve the productivity
of their store employees, who have to execute many workflows as part of a day's schedule. A project was
initiated to design a target architecture that automates part of these workflow steps with the help of
Artificial Intelligence and Machine Learning.
What were we trying to solve?
The objective of this project was to design and implement a solution that integrates with machine learning
algorithms that learn from historical data. The algorithms would operate not only on the transactional data
being sent but would also need reference and master data. As the algorithms retrain and mature over time,
additional reference and master data would be needed. This mandated a flexible data model for the different
master and reference data entities.
How is MongoDB leveraged in this use case?
Target State Architecture
A document-based data model was
a perfect fit for this use case, and
MongoDB was chosen as the
source of reference to render these
master and reference data entities
to the machine learning algorithms.
It is also a natural choice to log the
output of the different algorithms
and the orchestration engine. This
data is used for transaction
processing, operational reporting,
and production support.
What were the salient features of MongoDB used in the use case?
{
  "_id": { "patient_id": Number },
  "demographics": {
    "dob": String,
    "gender": String,
    "zip_code": String
  },
  …
}
Schema-free design
As more data attributes are needed by the machine learning algorithms, the schema-free nature of
MongoDB supports rapid development and deployment of changes.
Sharding and replication to achieve HA
Supports 250+ TPS across 50,000 terminals that run the store employee workflow, backed by an
active-active deployment of the application. MongoDB sharding and replication across data centers
support the workflow; reads from secondaries support the performance and high-availability requirements.
Change Streams
Multiple processes are supported by Change Streams:
• chain-, state-, and store-level settings and rollout of machine learning algorithms
• audit and event functions
Purge through TTL
Operational data is persisted for 10 days and then purged; a TTL index keeps each document for only
10 days.
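In MongoDB the 10-day purge is declared rather than coded: a TTL index such as `db.ops.createIndex({createdAt: 1}, {expireAfterSeconds: 864000})` lets the server delete expired documents in the background. The field name `createdAt` is an assumption; the equivalent cutoff logic, sketched in plain Python for clarity:

```python
from datetime import datetime, timedelta

TTL = timedelta(days=10)  # matches expireAfterSeconds = 864000

def is_expired(doc, now):
    """True once the operational document is past its 10-day retention."""
    return now - doc["createdAt"] >= TTL

now = datetime(2019, 10, 20)
fresh = {"createdAt": datetime(2019, 10, 15)}   # 5 days old -> kept
stale = {"createdAt": datetime(2019, 10, 1)}    # 19 days old -> purged
expired = [is_expired(d, now) for d in (fresh, stale)]
```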
Client Scenario # 4
Who is the client?
The client is a major international online e-commerce platform provider. They are planning a multi-year
journey with multiple strategic objectives: 1. replacing a monolithic application with a microservices
architecture; 2. meeting European and Asian legal data residency requirements; 3. responding to highly
elastic business demand on short notice; 4. bringing products to market quickly.
What were we trying to solve?
Part of the objective was to use the right data storage technology and architecture to deliver a global-scale
application, converting services from a legacy application into a distributed system capable of rapid
development, elastic capacity, data caging, high performance, and fault tolerance.
How is MongoDB leveraged in this use case?
Architecture
MongoDB was the right choice, with
native capabilities such as quick data
modelling, shard tags for data residency,
high performance and high availability,
and the ability to easily add or remove
capacity.
MongoDB is also one of the most mature
data storage options in the NoSQL/
document database domain for
mission-critical applications, and a
reliable partner for an enterprise strategy
of open source adoption.
What were the salient features of MongoDB used in the use case?
Data model – quick to market
{
  name: "Midhuna",
  spouse: {
    name: "Akash",
    age: 25
  }
}
Flexible schemas are effective with a microservices architecture: quick data models using complex
data types, rapid development, and easy adoption of development and design changes.
High performance & HA
Supports a hybrid load of 5,000 writes and 15,000 reads per second. Reads and writes can be highly
distributed using the sharding and replication features, and individual queries can be tuned with indexes
based on data access patterns.
Data residency
Using sharding and shard tags, data can be guaranteed to be stored at a specified location. This
significantly helps with country-specific legal requirements and improves performance by colocating
data with its users.
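Zone sharding makes the residency guarantee declarative: shards are tagged with a zone (e.g. `sh.addShardToZone("shard-eu", "EU")`) and ranges of the shard key are pinned to zones with `sh.updateZoneKeyRange(...)`, so matching documents can live only on shards in that zone. The routing decision can be sketched as follows; the zone names and country-code ranges here are illustrative, not the client's configuration:

```python
# Illustrative zone ranges over a country-code shard-key prefix:
# a document whose key falls in [lo, hi) is stored only on shards
# tagged with that zone.
ZONE_RANGES = [
    ("DE", "FR", "EU"),
    ("FR", "GB", "EU"),
    ("IN", "JP", "ASIA"),
]

def zone_for(country):
    """Return the zone whose range covers the country code, if any."""
    for lo, hi, zone in ZONE_RANGES:
        if lo <= country < hi:
            return zone
    return None  # unzoned: document may land on any shard

z_de = zone_for("DE")
z_in = zone_for("IN")
```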
Easier to build and operate
Ops Manager is a sophisticated tool from MongoDB that significantly improves the ease and efficiency of
building and managing clusters. In addition to building and operating clusters, Ops Manager provides
administration capabilities such as monitoring and notifications, backup and recovery, automatic failover,
and performance metrics visibility.
Client Scenario # 5
Who is the client?
The client is an American insurance firm. They decided to embark on a new journey to modernize the
customer's IT applications and infrastructure design. The target architecture allows new businesses and
vendors to collaborate on a centralized web portal. The new system moves away from the existing Oracle
environment and is capable of handling unstructured data streams from various agents and third-party
businesses.
What were we trying to solve?
The objective of this project was to maintain insurance information whose effective dates are in the past,
present, or future. The system also needs to turn back the clock and see agreement data as it was at a
specific time in the past. This is useful for retrieving historical agreement documents, for meeting
statutory data retention requirements, and for understanding how the data changes as it moves from
creation through obsolescence.
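Such a bi-temporal "as-of" query keeps two time axes per document: when the agreement version was effective, and when the record was written. A minimal sketch of the as-of filter in Python; the field names `effectiveFrom`/`effectiveTo`/`recordedAt` and the sample data are assumptions for illustration, not the client's schema:

```python
from datetime import datetime

def as_of(docs, effective_at, known_at):
    """Agreement versions effective at `effective_at`, as known at `known_at`.

    Per agreement, picks the latest record written on or before `known_at`
    whose effective interval covers `effective_at`.
    """
    latest = {}
    for d in docs:
        if d["recordedAt"] > known_at:
            continue  # not yet recorded at that point in time
        if not (d["effectiveFrom"] <= effective_at < d["effectiveTo"]):
            continue  # not effective at the queried moment
        prev = latest.get(d["agreementId"])
        if prev is None or d["recordedAt"] > prev["recordedAt"]:
            latest[d["agreementId"]] = d
    return list(latest.values())

docs = [
    {"agreementId": "A1", "premium": 100,
     "effectiveFrom": datetime(2019, 1, 1), "effectiveTo": datetime(2020, 1, 1),
     "recordedAt": datetime(2019, 1, 1)},
    {"agreementId": "A1", "premium": 120,  # correction recorded later
     "effectiveFrom": datetime(2019, 1, 1), "effectiveTo": datetime(2020, 1, 1),
     "recordedAt": datetime(2019, 6, 1)},
]
# As of March 2019, only the 100-premium version was known:
then = as_of(docs, datetime(2019, 3, 1), known_at=datetime(2019, 3, 1))
```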
How is MongoDB leveraged in this use case?
Target State Architecture
A document store was a perfect choice
for this requirement, and MongoDB was
chosen because it offers full ACID
transaction support.
The persistence engine is designed for
maximum flexibility in handling
transactions for current and future
business needs, and it separates
abstract data transactions from the
business logic. The bi-temporal data
transactions need to be transparent to
the application, without complicating
development and maintenance.
What were the salient features of MongoDB used in the use case?
Schema-less design
The insurance data is inherently rich, with a wide variety of ever-changing data attributes. MongoDB is
the leading document store, designed to offer a rich experience with support for modern programming
languages and rapid development techniques.
Data transactions
MongoDB is the only database that fully combines the power of the document model and a distributed
systems architecture with ACID guarantees. Through snapshot isolation, transactions provide a consistent
view of data and enforce all-or-nothing execution to maintain data integrity, even across sharded clusters.
Scaling through sharding
The client deals with volumes of data that require handling ever-changing insurance information at scale.
A MongoDB dataset can scale linearly across multiple servers without the application needing to know
when physical horizontal scaling comes into play.
Document-oriented DB
MongoDB is the most mature document-oriented database in the industry today. It inherently offers the
features and techniques suitable for storing and managing big-data-sized collections of literal documents
such as text documents, email messages, and XML documents.
© 2019 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys
acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. Except as expressly permitted, neither this
documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the
prior permission of Infosys Limited and/ or any named intellectual property rights holders under this document.
THANK YOU
MongoDB .local Chicago 2019: MongoDB – Powering the new age data demands

  • 1.
  • 2. Who are we? Dhaval Jagani, Principal Technology Architect, leads the open source database practice within the US for the Modernization COE. Rajib Deb, Associate Vice President, is Head of Architecture for the Modernization COE.
  • 3. MongoDB… Powering the new age data demands 3
  • 4. Where are we seeing the usage of new age databases? Building highly available applications; enabling event-driven architectures; implementing distributed query capability …and all of these keeping in view the volume, velocity and variety of data.
  • 5. MongoDB has been solving many of these new age demands Let us take a peek into some of these real world use cases 5
  • 6. Client Scenario # 1 Who is the client? Client is an American multinational technology conglomerate. They have embarked on a project to automate one of their business processes to enhance customer experience and reduce customer churn. The existing process is manual and touches multiple systems and applications across various lines of business. A project was initiated to design a target architecture that will completely automate the process and reduce the elapsed time from months to minutes. What were we trying to solve? The objective of this project was to design and implement a solution that will seamlessly integrate with heterogeneous applications, thereby reducing or eliminating manual touch points. The target architecture followed an event-driven architecture with Kafka as the integration broker. A mechanism was required to provide a real-time view of the progress of a transaction as it moves from one stage to the next.
  • 7. How is MongoDB leveraged in this use case? A transaction is complete when it has gone through all the stages shown here. It was important to have visibility into the process, and there was also a need to recreate the entire transaction in case of a failure. Each time a stage event occurs, it is also pushed to MongoDB, which forces a refresh of the Angular dashboard. To persist the transaction state, one collection was created with the data model below: { "transactionId": "12345", "currentState": "Quote Creation", "noOfCompletedStage": 3, "completedStages": [ { "stageName": "Client Onboarding", "stageStartTime": "19:08:00", "stageEndTime": "19:10:00" }, { "stageName": "Opportunity Creation", "stageStartTime": "19:11:00", "stageEndTime": "19:12:00" }, { "stageName": "Quote Creation", "stageStartTime": "19:14:00", "stageEndTime": "19:15:00" } ] } Architecture
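The transaction-state document above can be maintained with a single upsert per stage event. A minimal sketch in plain Python: it only builds the update document (field names follow the data model on this slide; the pymongo call itself is indicated in a comment, since it needs a live cluster).

```python
def stage_completed_update(stage_name, start_time, end_time):
    """Build the MongoDB update document recording one completed stage.

    Field names follow the collection sketched on this slide. With a live
    pymongo client you would apply it roughly as:
      transactions.update_one({"transactionId": tid}, update, upsert=True)
    """
    return {
        # Move the transaction to its newest stage and bump the counter.
        "$set": {"currentState": stage_name},
        "$inc": {"noOfCompletedStage": 1},
        # Append the stage record so the full history can be replayed.
        "$push": {
            "completedStages": {
                "stageName": stage_name,
                "stageStartTime": start_time,
                "stageEndTime": end_time,
            }
        },
    }

update = stage_completed_update("Quote Creation", "19:14:00", "19:15:00")
```

Keeping the whole history in one document is what makes it possible to recreate the transaction after a failure from a single read.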
  • 8. What were the salient features of MongoDB used in the use case? Schema Free Design • The framework needs to support multiple business processes • Each of them may have a different payload to process. Aggregation framework • Enables useful insights and optimization recommendations • Ability to aggregate the transactions by multiple dimensions. Change Streams • Enable auto refresh of the transaction monitoring dashboard • Any change or modification to a transaction must be pushed to the dashboard.
  • 9. Client Scenario # 2 Who is the client? Client is an American multinational technology conglomerate. Their current return merchandise process is manual and error-prone. If a customer returns a defective item, it takes manual effort and time to determine whether the item should be repaired or a new item shipped. What were we trying to solve? The objective of this project was to design a framework that orchestrates the entire return merchandise process. The orchestration metadata must be abstracted from the process so that it can support any process change with minimal rework. MongoDB was chosen as the orchestration metadata store.
  • 10. How is MongoDB leveraged in this use case? The orchestration metadata is stored as documents of the form: { "eventName": XXXX, "eventType": "Action", "eventAPI": { "endpoint": XXXX, "oauthpurl": XXXX, "payload": XXXX } } and { "eventName": XXXX, "sourceTopic": XXXX, "targetTopic": XXXX }. Each event (Event 1 … Event n) flows through a source topic lookup, a processing layer and a sink topic, backed by a metadata cache.
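The source/target topic documents boil down to a keyed metadata cache in front of the processing layer. A hypothetical in-memory sketch; the event names and topic names are invented for illustration, and in the real system the cache would be loaded from the MongoDB metadata collection.

```python
# Hypothetical metadata cache, keyed by event name as in the documents above.
# In the real framework this would be populated from the MongoDB metadata store.
TOPIC_METADATA = {
    "item_received": {"sourceTopic": "rma.intake", "targetTopic": "rma.triage"},
    "triage_done": {"sourceTopic": "rma.triage", "targetTopic": "rma.dispatch"},
}

def route_event(event_name):
    """Resolve source and target topics for an event from cached metadata,
    so that process changes only touch the metadata store, not the code."""
    meta = TOPIC_METADATA.get(event_name)
    if meta is None:
        raise KeyError(f"no orchestration metadata for event {event_name!r}")
    return meta["sourceTopic"], meta["targetTopic"]
```

Because routing is data-driven, adding a new stage to the return process means inserting one metadata document rather than redeploying the orchestrator.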
  • 11. Client Scenario # 3 Who is the client? Client is an American healthcare retail giant. They have embarked on a project to improve the productivity of their store employees. As part of a store employee's daily schedule, they have to execute many workflows. A project was initiated to design a target architecture that will automate part of these workflow steps with the help of artificial intelligence and machine learning. What were we trying to solve? The objective of this project is to design and implement a solution that integrates with machine learning algorithms that learn from historical data. The machine learning algorithms would operate not only on the transactional data being sent but would also need reference and master data. As the algorithms retrain and mature over time, additional reference and master data would be needed. This mandates a flexible data model for the different master and reference data entities.
  • 12. How is MongoDB leveraged in this use case? Target State Architecture: A document-based data model was a perfect choice for this use case, and MongoDB was chosen as the source of reference to serve these master and reference data entities to the machine learning algorithms. It is also a perfect choice for logging the output of the different algorithms and the orchestration engine. This data is used for transaction processing, operational reporting and production support.
  • 13. What were the salient features of MongoDB used in the use case? Schema Free Design: { "_id": { "patient_id": Number }, "demographics": { "dob": String, "gender": String, "zip_code": String }, … } As more data attributes are needed by the machine learning algorithms, the schema-free nature of MongoDB supports rapid development and deployment of changes. Sharding and Replication to achieve HA: Support for 250+ TPS across 50,000 terminals running the store employee workflow, backed by an active-active deployment of the application; MongoDB sharding and replication across data centers support the workflow, with reads from secondaries to meet performance and high-availability requirements. Change Streams: Multiple processes are supported by change streams, including chain state, store-level settings, roll-out of machine learning algorithms, and audit and event functions. Purge through TTL: Operational data is persisted for 10 days and then purged; a TTL index keeps documents for only 10 days.
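The 10-day purge maps directly onto a TTL index. A sketch that builds the index specification as plain data; the timestamp field name is an assumption, and with pymongo you would apply it as `collection.create_index([(field, 1)], expireAfterSeconds=...)` on a live cluster.

```python
def ttl_index_spec(field="createdAt", days=10):
    """Keys and options for a TTL index that lets MongoDB purge documents
    `days` after the date stored in `field`. The field name `createdAt`
    is illustrative; TTL indexes require a date-typed indexed field."""
    return {
        "keys": [(field, 1)],  # single ascending key, as TTL indexes expect
        "expireAfterSeconds": days * 24 * 60 * 60,
    }

spec = ttl_index_spec()  # 10-day retention, per the slide
```

Expiry is handled by a server-side background task, so the application needs no purge job of its own.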
  • 14. Client Scenario # 4 Who is the client? Client is a major international online e-commerce platform provider. They are planning a multi-year journey with multiple strategic objectives: 1. Replace the monolithic application with a micro-services architecture 2. Address European and Asian legal data residency requirements 3. Respond to highly elastic business demand on short notice 4. Bring products to market quickly. What were we trying to solve? Part of the objective was to use the right data storage technology and architecture to deliver a global-scale application, converting services from a legacy application using distributed computing capable of rapid development, elastic capacity, data caging, high performance and fault tolerance.
  • 15. How is MongoDB leveraged in this use case? Architecture: MongoDB was the right choice, with native capabilities such as quick data modelling, shard tags for data residency, high performance and high availability, and the ability to easily add or remove capacity. MongoDB is also one of the most mature data storage options in the NoSQL/document database domain for mission-critical applications and a reliable partner for an enterprise strategy of open source adoption.
  • 16. What were the salient features of MongoDB used in the use case? Data model – Quick to market: { name: "Midhuna", spouse: { name: "Akash", age: 25 } } Flexible schemas are effective with a micro-services architecture: quick data models using complex data types, rapid development, and easy adoption of development and design changes. High Performance & HA: Supports a hybrid load of 5,000 writes and 15,000 reads per second. Reads and writes can be highly distributed using data sharding and replication, and individual queries can be tuned with indexes based on data access patterns. Data Residency: Using data sharding and shard tags, data storage can be pinned to a specified location. This helps significantly with country-specific legal requirements and improves performance by colocating data with its users. Easier to build and operate: Ops Manager is a very sophisticated tool from MongoDB that significantly improves the ease and efficiency of building and managing clusters. In addition to building and operating clusters, Ops Manager provides administration capabilities such as monitoring/notification, backup/recovery, automatic failover and performance metrics visibility.
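The shard-tag (zone sharding) approach to data residency comes down to two admin commands. A sketch building them as plain documents; the namespace, zone, shard name and country-based shard key are all assumptions for illustration, and each document would be executed via `admin.command(...)` on a live sharded cluster.

```python
def residency_zone_commands(ns, shard, zone, lo, hi):
    """Admin commands (as plain documents) that pin a shard-key range to a
    zone, so matching documents stay on shards in that region. Assumes a
    shard key with a leading `country` field; the max bound is exclusive."""
    return [
        # 1. Associate a shard (e.g. one physically in the EU) with the zone.
        {"addShardToZone": shard, "zone": zone},
        # 2. Route the shard-key range [lo, hi) of this namespace to the zone.
        {
            "updateZoneKeyRange": ns,
            "min": {"country": lo},
            "max": {"country": hi},
            "zone": zone,
        },
    ]

cmds = residency_zone_commands("shop.orders", "shard-eu-01", "EU", "AT", "IT")
```

The balancer then keeps chunks in that key range on the zone's shards, which is what satisfies the residency requirement without application changes.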
  • 17. Client Scenario # 5 Who is the client? Client is an American insurance firm. They decided to embark on a new journey to modernize the customer's IT applications and infrastructure design. The target architecture will allow new businesses and vendors to collaborate on a centralized web portal. The new system will move away from the existing Oracle environment and will be capable of handling unstructured data streams from various agents and third-party businesses. What were we trying to solve? The objective of this project is to maintain insurance information whose effective dates are in the past, present or future. The system also needs to turn back the clock and see agreement data as it was at a specific time in the past. This is useful for retrieving historical agreement documents, for meeting statutory data retention requirements, and for understanding how data changes as it moves from creation through obsolescence.
  • 18. How is MongoDB leveraged in this use case? Target State Architecture: A document store was a perfect choice for this requirement, and MongoDB was chosen as it offers full ACID transaction support. The persistence engine was designed for maximum flexibility in handling transactions for current and future business needs, and to separate abstract data transactions from the business logic. The bi-temporal data transactions need to be transparent to the application without complicating development and maintenance.
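The "turn back the clock" requirement is a classic bi-temporal read: select the version of an agreement that was effective at one time and known to the system at another. A sketch of the query filter as plain data; the field names (effectiveFrom/effectiveTo, recordedFrom/recordedTo) are an assumed convention, not the client's actual schema, and the filter would be passed to `collection.find_one(...)` on a live database.

```python
from datetime import datetime

def as_of_filter(agreement_id, valid_at, known_at):
    """Filter for the agreement version that was effective at `valid_at`
    and recorded in the system as of `known_at`. Open-ended versions are
    assumed to carry a far-future sentinel in effectiveTo/recordedTo."""
    return {
        "agreementId": agreement_id,
        # Business-time axis: when the terms applied in the real world.
        "effectiveFrom": {"$lte": valid_at},
        "effectiveTo": {"$gt": valid_at},
        # System-time axis: when the system knew about this version.
        "recordedFrom": {"$lte": known_at},
        "recordedTo": {"$gt": known_at},
    }

f = as_of_filter("AGR-1", datetime(2019, 6, 1), datetime(2019, 7, 1))
```

Separating the two time axes is what allows both "what did the agreement say on June 1" and "what did we believe on July 1 it said on June 1" to be answered from the same collection.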
  • 19. What were the salient features of MongoDB used in the use case? Schema Less Design: The insurance data is inherently rich, with a wide variety of ever-changing attributes. MongoDB is the leading document store, designed to offer a rich experience with support for modern programming languages and rapid development techniques. Data transactions: MongoDB is the only database that fully combines the power of the document model and a distributed systems architecture with ACID guarantees. Through snapshot isolation, transactions provide a consistent view of data and enforce all-or-nothing execution to maintain data integrity, even across sharded clusters. Scaling through Sharding: Allstate deals with volumes of data that require handling ever-changing insurance information at scale. A MongoDB dataset can scale linearly across multiple servers without the application needing to know when physical horizontal scaling comes into play. Document Oriented DB: MongoDB is the most mature document-oriented database in the industry today. It inherently offers the features and techniques suitable for storing and managing big-data-sized collections of literal documents such as text documents, email messages and XML documents.
  • 20. © 2019 Infosys Limited, Bengaluru, India. All Rights Reserved. Infosys believes the information in this document is accurate as of its publication date; such information is subject to change without notice. Infosys acknowledges the proprietary rights of other companies to the trademarks, product names and such other intellectual property rights mentioned in this document. Except as expressly permitted, neither this documentation nor any part of it may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, printing, photocopying, recording or otherwise, without the prior permission of Infosys Limited and/ or any named intellectual property rights holders under this document. THANK YOU