Snowplow: scalable open source web and
event analytics platform, built on AWS
Using EMR, Redshift, CloudFront and Elastic Beanstalk to build a
scalable, log-everything, query-everything data infrastructure
What is Snowplow?
• Web analytics platform
• JavaScript tags -> event-level data delivered in your own Amazon Redshift or
PostgreSQL database, for analysis in R, Excel, Tableau
• Open source -> run on your own AWS account
• Own your own data
• Join with 3rd party data sets (PPC, Facebook, CRM)
• Analyse with any tool you want
• Architected to scale
• Ad networks track 100Ms of events (impressions) per day
• General purpose event analytics platform -> Universal Event Analytics
• Log-everything infrastructure works for web data and other event data sets
Why we built Snowplow
• Traditional web analytics tools are very limited
• Siloed -> hard to integrate
• Reports built for publishers and retailers in the 1990s
• Impressed by how easy AWS makes it to collect, manage and process massive
data sets
• More on this in a second…
• Impressed by new generation of agile BI tools
• Tableau, Excel, R…
• Commoditise and standardise event data capture (esp. data structure) -> enable
innovation in the use of that data
• Lots of tech companies have built a similar stack to handle data internally
• Makes sense for everyone to standardise around an open source product
Snowplow’s (loosely coupled) technical architecture
1. Trackers 2. Collectors 3. Enrich 4. Storage 5. Analytics
(A-D: standardised data protocols between the stages)
• Trackers: generate event data (e.g. the JavaScript tracker)
• Collectors: receive data from trackers and log it to S3
• Enrich: clean and enrich the raw data (e.g. geo-IP lookup, sessionization, referrer parsing)
• Storage: store the data in a format suitable for analysis
The Snowplow technology stack: trackers
1. Trackers 2. Collectors 3. Enrich 4. Storage 5. Analytics
JavaScript tracker
Pixel (No-JS) tracker
Arduino tracker
Lua tracker
Trackers on the roadmap:
• Java
• Python
• Ruby
• Android
• iOS…
The Snowplow technology stack: collectors
1. Trackers 2. Collectors 3. Enrich 4. Storage 5. Analytics
CloudFront collector
• Tracker: GET request to a pixel hosted on CloudFront
• Event data appended to the GET request as a query string
• CloudFront logging -> data automatically logged to S3
• Scalable – the CloudFront CDN is built to handle an enormous volume and velocity of requests
Clojure collector on Elastic Beanstalk
• Enables tracking users across domains by setting a third-party cookie server-side
• The Clojure collector runs on Tomcat: the Tomcat log format is customized to match the CloudFront log file format
• Elastic Beanstalk supports rotation of Tomcat logs into S3
• Scalable: Elastic Beanstalk makes it easy to handle spikes in request volumes
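The pixel mechanism above can be sketched in a few lines of Scala. This is a hypothetical illustration of how a tracker might encode an event as a query string on a GET request to the collector's pixel — the `PixelUrl` object, the `/i` path and the parameter names are illustrative assumptions, not the actual Snowplow tracker protocol:

```scala
import java.net.URLEncoder

// Hypothetical sketch: encode event fields as a query string appended to a
// GET request for the collector's tracking pixel.
object PixelUrl {
  def build(collectorHost: String, params: Map[String, String]): String = {
    val qs = params.toSeq
      .map { case (k, v) =>
        s"${URLEncoder.encode(k, "UTF-8")}=${URLEncoder.encode(v, "UTF-8")}"
      }
      .mkString("&")
    s"http://$collectorHost/i?$qs"
  }
}
```

CloudFront then logs each such request (including the query string) to S3, which is what makes the collector effectively a static, massively scalable endpoint.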
The Snowplow technology stack: data enrichment
1. Trackers 2. Collectors 3. Enrich 4. Storage 5. Analytics
Scalding Enrichment on EMR
• Enrichment process run 1-4x per day
• Consolidate log files from collector, clean up, enrich, and write back to storage (S3)
• Enrichments incl. referrer parsing, Geo-IP lookups, server-side sessionization
• Process written in Scalding: a Scala API for Cascading
• Cascading: a high-level library for Hadoop, especially well suited to building robust data pipelines
(ETL) that e.g. push bad data into separate sinks from validated data
• Powered by EMR: cluster fired up to perform the enrichment step, then shut down
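As a toy stand-in for the kind of per-row work the Scalding enrichment performs, here is a minimal referrer-classification sketch in plain Scala — the `ReferrerEnrichment` object and its tiny search-engine list are illustrative assumptions, not Snowplow's actual referrer parser:

```scala
import java.net.URI

// Illustrative enrichment step: classify a referrer URL into a medium.
// A toy stand-in for one of the per-row enrichments the EMR job performs.
object ReferrerEnrichment {
  private val searchEngines =
    Set("google.com", "www.google.com", "bing.com", "www.bing.com")

  def medium(referrer: String): String =
    if (referrer.isEmpty) "direct"
    else {
      val host = Option(new URI(referrer).getHost).getOrElse("")
      if (searchEngines.contains(host)) "search" else "referral"
    }
}
```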
Hadoop and EMR are excellent for data enrichment
• For many, the volume of data processed with each run is not large enough to necessitate
a big data solution…
• … but building the process on Hadoop / EMR means it is easy to rerun the entire
historical Snowplow data set through Enrichment e.g.
• When a new enrichment becomes available
• When the company wants to apply a new definition of a key variable in their Snowplow data
set (e.g. new definition for sessionization, or new definition for user cohort) i.e. change in
business logic
• Reprocessing the entire data set isn’t just possible -> it’s easy (as easy as processing new
data) and fast (just fire up a larger cluster)
• This is game-changing in web analytics, where reprocessing data has never been possible
Scalding + Scalaz make it easy for us to build rich, validated ETL
pipelines to run on EMR
• Scalaz is a functional programming library for Scala – it has a Validation data type which
lets us accumulate errors as we process our raw Snowplow rows
• Scalding + Scalaz let us write ETL in a very expressive way:
• In the slide’s code, ValidatedMaybeCanonicalOutput contains either a valid Snowplow
event, or a list of validation failures (Strings) encountered while trying to parse the
raw Snowplow log row
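The slide's code screenshot is not reproduced here, so the following stdlib-only Scala sketch shows the underlying idea: each field parser returns either an error or a value, and the row parser accumulates all failures instead of stopping at the first. Scalaz's Validation expresses this more elegantly; `CanonicalOutput` and the field names below are illustrative stand-ins, not the real classes:

```scala
// Stdlib sketch of error accumulation in the spirit of Scalaz Validation.
// CanonicalOutput here is an illustrative stand-in, not the real class.
final case class CanonicalOutput(timestamp: Long, eventType: String)

object RowParser {
  def parseTimestamp(s: String): Either[String, Long] =
    try Right(s.trim.toLong)
    catch { case _: NumberFormatException => Left(s"unparseable timestamp: $s") }

  def parseEventType(s: String): Either[String, String] =
    if (s.nonEmpty) Right(s) else Left("missing event type")

  // Accumulate failures from both fields instead of short-circuiting
  def parse(ts: String, et: String): Either[List[String], CanonicalOutput] =
    (parseTimestamp(ts), parseEventType(et)) match {
      case (Right(t), Right(e)) => Right(CanonicalOutput(t, e))
      case (l, r)               => Left(List(l, r).collect { case Left(err) => err })
    }
}
```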
Scalding + Scalaz make it easy for us to build rich, validated ETL
pipelines to run on EMR (continued)
• Scalding + Scalaz let us route our bad raw rows into a “bad bucket” in S3, along with all
of the validation errors encountered for that row:
• (This is pretty-printed – in fact the flatfile is one JSON object per line)
• In the future we could add an aggregation job to process these “bad bucket” files and
report on the number of errors encountered and most common validation failures
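Concretely, one “bad bucket” line might look like the output of this sketch — the raw row plus its accumulated errors, serialized as a single JSON object per line. The field names (`line`, `errors`) and the hand-rolled serializer are assumptions for illustration, not the exact Snowplow bad-row format:

```scala
// Illustrative shape of a "bad bucket" record: the original raw line plus
// its validation errors, one JSON object per line in the S3 flatfile.
object BadBucket {
  // Escape characters that would break a JSON string literal
  private def esc(s: String): String =
    s.flatMap {
      case '"'  => "\\\""
      case '\\' => "\\\\"
      case '\n' => "\\n"
      case '\t' => "\\t"
      case c    => c.toString
    }

  def toJsonLine(rawLine: String, errors: List[String]): String = {
    val errs = errors.map(e => "\"" + esc(e) + "\"").mkString(",")
    s"""{"line":"${esc(rawLine)}","errors":[$errs]}"""
  }
}
```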
The Snowplow technology stack: storage and analytics
1. Trackers 2. Collectors 3. Enrich 4. Storage 5. Analytics
S3
Redshift
Postgres (coming soon)
Loading Redshift from an EMR job is relatively
straightforward, with some gotchas to be aware of
• Load Redshift from S3, not DynamoDB – the costs for loading from DynamoDB only
make sense if you need the data in DynamoDB anyway
• Your EMR job can either write directly to S3 (slow), or write to local HDFS and then
S3DistCp to S3 (faster)
• For Scalding, our Redshift table target is a POJO assembled using
scala.reflect.BeanProperty – with fields declared in the same order as in Redshift:
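A hypothetical sketch of such a table-target POJO (the column names are illustrative, and `scala.reflect.BeanProperty` from the slide moved to `scala.beans.BeanProperty` in later Scala versions):

```scala
import scala.beans.BeanProperty

// Fields declared with @BeanProperty, in the same order as the columns in
// the Redshift table definition (names here are illustrative).
class EventRow {
  @BeanProperty var eventId: String = _
  @BeanProperty var collectorTstamp: String = _
  @BeanProperty var pageUrl: String = _
}
```

@BeanProperty generates the getEventId/setEventId-style accessors that Cascading's reflection-based field handling expects.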
Make sure to escape tabs, newlines etc in your strings
• Once we have Snowplow events in CanonicalOutput form, we simply unpack them into
tuple fields for writing:
• Remember you are loading tab-separated, newline-terminated values into Redshift, so
make sure to escape all tabs, newlines and other special characters in your strings:
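A minimal escaping sketch, assuming backslash-escaping as the convention (Redshift's COPY can consume backslash-escaped data via its ESCAPE option); this is an illustration, not Snowplow's actual code:

```scala
// Escape a string field before writing it into a tab-separated,
// newline-terminated row destined for Redshift's COPY.
object TsvEscape {
  def escape(s: String): String =
    s.flatMap {
      case '\t' => "\\t"
      case '\n' => "\\n"
      case '\r' => "\\r"
      case '\\' => "\\\\"
      case c    => c.toString
    }

  // Join escaped fields with the tab delimiter to form one row
  def toRow(fields: Seq[String]): String = fields.map(escape).mkString("\t")
}
```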
You need to handle field length too
• You can either handle string length proactively in your code, or add TRUNCATECOLUMNS to
your Redshift COPY command
• Currently we proactively truncate:
• BUT this code is not Unicode-aware (Redshift varchar field lengths are in terms of
bytes, not characters) and rather fragile – we will likely switch to using
TRUNCATECOLUMNS
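A byte-aware alternative along those lines might look like this sketch — trim whole characters until the UTF-8 encoding fits the column's byte limit. It is an illustration only (and does not guard against splitting surrogate pairs):

```scala
import java.nio.charset.StandardCharsets

// Unicode-aware truncation: Redshift varchar lengths count UTF-8 bytes,
// not characters, so trim characters until the encoded length fits.
object ByteTruncate {
  def truncate(s: String, maxBytes: Int): String = {
    var out = s
    while (out.getBytes(StandardCharsets.UTF_8).length > maxBytes)
      out = out.substring(0, out.length - 1)
    out
  }
}
```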
Then use STL_LOAD_ERRORS, Excel and MAXERROR to help
debug load errors
• If you do get load errors, then check STL_LOAD_ERRORS in Redshift – it gives you all the
information you need to fix the load error
• If the error is non-obvious, pull your POJO, Redshift table definition and bad row (from
STL_LOAD_ERRORS) into Excel to compare:
• COPY … MAXERROR X is your friend – lets you see more than just the first load error
TSV text files are great for feeding Redshift, but be careful of
using them as your “master data store”
• Some limitations to using tab-separated flat files to store your data:
• Inefficient for storage/querying – versus e.g. binary formats
• Schemaless – no way of knowing the structure without eyeballing the file
• Fragile – problems with field lengths, tabs, newlines, control characters etc.
• Inexpressive – no support for things like union data types; rows can only be 65kb wide (you
can insert fatter rows into Redshift, but cannot query them)
• Brittle – adding a new field to Redshift means the old files don’t load; you need to re-run the
EMR job over all of your archived input data to re-generate them
• All of this means we will be moving to a more robust Snowplow event storage format
on disk (Avro), and simply generating TSV files from those Avro events as needed to
feed Redshift (or Postgres or Amazon RDS or …)
• Recommendation: write a new Hadoop job step to take your existing outputs from
EMR and convert into Redshift-friendly TSVs; don’t start hacking on your existing data
flow
Any questions?
Learn more
• https://github.com/snowplow/snowplow
• http://snowplowanalytics.com/
• @snowplowdata

Snowplow presentation to HUG UK

  • 1. Snowplow: scalable open source web and event analytics platform, built on AWS Using EMR, Redshift, Cloudfront and Elastic Beanstalk to build a scalable, log-everything, query-everything data infrastructure
  • 2. What is Snowplow? • Web analytics platform • Javascript tags -> event-level data delivered in your own Amazon Redshift or PostgreSQL database, for analysis in R, Excel, Tableau • Open source -> run on your own AWS account • Own your own data • Join with 3rd party data sets (PPC, Facebook, CRM) • Analyse with any tool you want • Architected to scale • Ad networks track 100Ms of events (impressions) per day • General purpose event analytics platform -> Universal Event Analytics • Log-everything infrastructure works for web data and other event data sets
  • 3. Why we built Snowplow • Traditional web analytics tools are very limited • Siloed -> hard to integrate • Reports built for publishers and retailers in the 1990s • Impressed by how easy AWS makes it to collect, manage and process massive data sets • More on this in a second… • Impressed by new generation of agile BI tools • Tableau, Excel, R… • Commoditise and standardise event data capture (esp. data structure) -> enable innovation in the use of that data • Lots of tech companies have built a similar stack to handle data internally • Makes sense for everyone to standardise around an open source product
  • 4. Snowplow’s (loosely coupled) technical architecture 1. Trackers 2. Collectors 3. Enrich 4. Storage 5. AnalyticsB C D A D Standardised data protocols Generate event data (e.g. Javascript tracker) Receive data from trackers and log it to S3 Clean and enrich raw data (e.g. geoIP lookup, session ization, referrer parsing) Store data in format suitable to enable analysis
  • 5. The Snowplow technology stack: trackers 1. Trackers 2. Collectors 3. Enrich 4. Storage 5. Analytics Javascript tracker Pixel (No-JS) tracker Arduino tracker Lua tracker Trackers on the roadmap: • Java • Python • Ruby • Android • iOS…
  • 6. The Snowplow technology stack: collectors 1. Trackers 2. Collectors 3. Enrich 4. Storage 5. Analytics Cloudfront collector Clojure collector on Elastic Beanstalk • Tracker: GET request to pixel hosted on Cloudfront • Event data appended to the GET request as a query string • Cloudfront logging -> data automatically logged to S3 • Scalable – Cloudfront CDN built to handle enormous volume and velocity of requests • Enable tracking users across domains, by setting a 3rd party cookie server side • Clojure collector runs on Tomcat: customize format of Tomcat logs to match Cloudfront log file format • Elastic Beanstalk supports rotation of Tomcat logs into S3 • Scalable: Elastic Beanstalk makes it easy to handle spikes in request volumes
The Snowplow technology stack: data enrichment
Scalding Enrichment on EMR:
• The enrichment process runs 1-4x per day
• It consolidates log files from the collector, cleans them up, enriches them, and writes them back to storage (S3)
• Enrichments include referrer parsing, geo-IP lookups and server-side sessionization
• The process is written in Scalding: a Scala API for Cascading
• Cascading is a high-level library for Hadoop, especially well suited to building robust data pipelines (ETL) that e.g. push bad data into separate sinks from validated data
• Powered by EMR: a cluster is fired up to perform the enrichment step, then shut down
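To make the shape of an enrichment step concrete: the real pipeline is Scalding on EMR, but a single enrichment like referrer parsing can be sketched in Python. The search-engine table and field names (`refr_url`, `refr`) here are illustrative assumptions, not Snowplow's actual referrer-parsing logic.

```python
from urllib.parse import urlparse, parse_qs

# Illustrative stand-ins for the real enrichments; the actual
# Snowplow process also does geo-IP lookups and sessionization.
SEARCH_ENGINES = {"www.google.com": "q", "www.bing.com": "q"}

def parse_referrer(referrer_url):
    """Classify a referrer as search vs referral, extracting search terms."""
    parts = urlparse(referrer_url)
    if parts.netloc in SEARCH_ENGINES:
        param = SEARCH_ENGINES[parts.netloc]
        terms = parse_qs(parts.query).get(param, [""])[0]
        return {"medium": "search", "source": parts.netloc, "terms": terms}
    return {"medium": "referral", "source": parts.netloc, "terms": ""}

def enrich(raw_event):
    """One enrichment step: annotate a cleaned event with referrer data."""
    enriched = dict(raw_event)
    enriched["refr"] = parse_referrer(raw_event.get("refr_url", ""))
    return enriched
```

Each enrichment is a pure function from raw row to annotated row, which is what makes it easy to re-run the whole history through a new version of the process.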
Hadoop and EMR are excellent for data enrichment
• For many users, the volume of data processed with each run is not large enough to necessitate a big data solution…
• … but building the process on Hadoop / EMR means it is easy to rerun the entire historical Snowplow data set through enrichment, e.g.:
• When a new enrichment becomes available
• When the company wants to apply a new definition of a key variable in their Snowplow data set (e.g. a new definition for sessionization, or a new definition for user cohorts), i.e. a change in business logic
• Reprocessing the entire data set isn’t just possible -> it’s easy (as easy as processing new data) and fast (just fire up a larger cluster)
• This is game-changing in web analytics, where reprocessing data has never been possible
Scalding + Scalaz make it easy for us to build rich, validated ETL pipelines to run on EMR
• Scalaz is a functional programming library for Scala; its Validation data type lets us accumulate errors as we process our raw Snowplow rows
• Scalding + Scalaz let us write ETL in a very expressive way
• In the slide’s code example, ValidatedMaybeCanonicalOutput contains either a valid Snowplow event, or a list of validation failures (Strings) which were encountered trying to parse the raw Snowplow log row
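The deck's actual code is Scala using Scalaz's Validation; as a hedged analogue, the "either a valid event or a list of failures" idea can be sketched in Python. The field layout and event types below are invented for illustration, not Snowplow's canonical event format.

```python
def validate_row(row):
    """Return (event, []) on success or (None, [errors]) on failure --
    a Python analogue of Scalaz Validation: independent checks
    accumulate their errors rather than short-circuiting at the
    first problem."""
    fields = row.split("\t")
    if len(fields) != 3:
        return None, ["expected 3 fields, got %d" % len(fields)]
    timestamp, event_type, user_id = fields
    errors = []
    if not timestamp.isdigit():
        errors.append("timestamp is not numeric: %r" % timestamp)
    if event_type not in ("pv", "ev"):
        errors.append("unknown event type: %r" % event_type)
    if errors:
        return None, errors
    return {"tstamp": int(timestamp), "type": event_type, "user": user_id}, []
```

Accumulating all failures per row (rather than stopping at the first) is what makes the "bad bucket" reporting on the next slide useful.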
Scalding + Scalaz make it easy for us to build rich, validated ETL pipelines to run on EMR (continued)
• Scalding + Scalaz let us route our bad raw rows into a “bad bucket” in S3, along with all of the validation errors which were encountered for that row
• (The slide’s example is pretty-printed; in fact the flatfile is one JSON object per line)
• In the future we could add an aggregation job to process these “bad bucket” files and report on the number of errors encountered and the most common validation failures
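The "one JSON object per line" flatfile format can be sketched as follows; the `line`/`errors` field names are illustrative assumptions, not necessarily Snowplow's bad-row schema.

```python
import json

def bad_row_json(raw_line, errors):
    # One JSON object per line, ready to append to an S3 "bad
    # bucket" flatfile: the failed raw row plus every validation
    # error accumulated for it.
    return json.dumps({"line": raw_line, "errors": errors})

record = bad_row_json("2013-05-01\tGET\t/i", ["Unrecognised event type"])
```

Keeping one object per line (rather than a pretty-printed array) means downstream Hadoop jobs can split and aggregate the bad-bucket files line by line.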
The Snowplow technology stack: storage and analytics
• S3
• Redshift
• Postgres (coming soon)
Loading Redshift from an EMR job is relatively straightforward, with some gotchas to be aware of
• Load Redshift from S3, not DynamoDB: the costs of loading from DynamoDB only make sense if you need the data in DynamoDB anyway
• Your EMR job can either write directly to S3 (slow), or write to local HDFS and then S3DistCp to S3 (faster)
• For Scalding, our Redshift table target is a POJO assembled using scala.reflect.BeanProperty, with fields declared in the same order as in Redshift
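The S3-to-Redshift load itself is a single COPY statement. A minimal sketch of assembling one, with an invented table name, bucket path and credentials placeholder:

```python
def copy_statement(table, s3_path, creds):
    # Redshift bulk-loads tab-separated files straight from S3;
    # options like MAXERROR and TRUNCATECOLUMNS (covered on later
    # slides) can be appended to make early loads more forgiving.
    return ("COPY {} FROM '{}' "
            "CREDENTIALS '{}' "
            "DELIMITER '\\t'").format(table, s3_path, creds)

sql = copy_statement(
    "events",
    "s3://my-bucket/enriched/part-",
    "aws_access_key_id=<key>;aws_secret_access_key=<secret>")
```

Because COPY reads every file matching the S3 prefix in parallel, pointing it at the EMR job's output prefix loads the whole run in one statement.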
Make sure to escape tabs, newlines etc. in your strings
• Once we have Snowplow events in CanonicalOutput form, we simply unpack them into tuple fields for writing
• Remember that you are loading tab-separated, newline-terminated values into Redshift, so make sure to escape all tabs, newlines and other special characters in your strings
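A sketch of one escaping convention (backslash escapes) in Python; the deck's real code is Scala, and whatever convention you pick must match the options on your Redshift COPY command (e.g. ESCAPE):

```python
def escape_field(s):
    """Escape the characters that would corrupt a tab-separated,
    newline-terminated load file: escape backslashes first so the
    other substitutions cannot be double-escaped."""
    return (s.replace("\\", "\\\\")
             .replace("\t", "\\t")
             .replace("\n", "\\n")
             .replace("\r", "\\r"))

def to_tsv(fields):
    # One event -> one tab-separated, newline-terminated row.
    return "\t".join(escape_field(f) for f in fields) + "\n"
```

An unescaped tab silently shifts every subsequent column, so it is worth escaping defensively even for fields you "know" are clean.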
You need to handle field length too
• You can either handle string length proactively in your code, or add TRUNCATECOLUMNS to your Redshift COPY command
• Currently we proactively truncate in our ETL code
• BUT this code is not unicode-aware (Redshift varchar field lengths are in terms of bytes, not characters) and rather fragile; we will likely switch to using TRUNCATECOLUMNS
Use STL_LOAD_ERRORS, Excel and MAXERROR to help debug load errors
• If you do get load errors, check STL_LOAD_ERRORS in Redshift: it gives you all the information you need to fix the load error
• If the error is non-obvious, pull your POJO, Redshift table definition and the bad row (from STL_LOAD_ERRORS) into Excel to compare them side by side
• COPY … MAXERROR X is your friend: it lets you see more than just the first load error
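A typical diagnostic query against STL_LOAD_ERRORS, held here as a Python string; the columns selected (filename, line_number, colname, raw_field_value, err_reason) are standard STL_LOAD_ERRORS columns, but verify them against your Redshift version's system-table reference.

```python
# Pulls the most recent load failures with the offending file, line,
# column, raw value and Redshift's explanation of the rejection.
DEBUG_LOAD_ERRORS = """
SELECT starttime, filename, line_number, colname,
       raw_field_value, err_reason
FROM stl_load_errors
ORDER BY starttime DESC
LIMIT 10;
"""
```

The raw_field_value column is the one to paste into Excel next to your table definition when the error message alone is not obvious.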
TSV text files are great for feeding Redshift, but be careful of using them as your “master data store”
• Some limitations of using tab-separated flat files to store your data:
• Inefficient for storage/querying, versus e.g. binary files
• Schemaless: no way of knowing the structure without visually eyeballing the file
• Fragile: problems with field length, tabs, newlines, control characters etc.
• Inexpressive: no support for things like union data types; rows can only be 65kb wide (you can insert fatter rows into Redshift, but cannot query them)
• Brittle: adding a new field to Redshift means the old files don’t load; you need to re-run the EMR job over all of your archived input data to regenerate them
• All of this means we will be moving to a more robust Snowplow event storage format on disk (Avro), and simply generating TSV files from those Avro events as needed to feed Redshift (or Postgres or Amazon RDS or …)
• Recommendation: write a new Hadoop job step to take your existing outputs from EMR and convert them into Redshift-friendly TSVs; don’t start hacking on your existing data flow
Any questions?
Learn more:
• https://github.com/snowplow/snowplow
• http://snowplowanalytics.com/
• @snowplowdata