Exploring the Data Science Process


  1. 1. Vishal Patel June 2017 Exploring the Data Science Process
  2. 2. • Vishal Patel • Data Science Consultant • Founder of DERIVE, LLC • Fully Automated Advanced Analytics products • Data Science services • MS in Computer Science and MS in Decision Sciences • Richmond, VA www.derive.io
  3. 3. Data Science
  4. 4. HOW TO BUILD A MACHINE LEARNING MODEL IN THREE QUICK AND EASY STEPS USING [XXX]!!! – Most tutorials
  5. 5. 50% of analytic projects fail. – Gartner, 2015
  6. 6. On September 21, 2009, the grand prize of US$1,000,000 was awarded to BellKor's Pragmatic Chaos, the team that bested Netflix's own algorithm for predicting ratings by 10.06%.
  7. 7. “[T]he additional accuracy gains that we measured did not seem to justify the engineering effort needed to bring them into a production environment.”
  8. 8. Analytic projects fail because… …they aren’t completed within budget or on schedule, or because they fail to deliver the features and benefits that are optimistically agreed on at their outset.
  9. 9. How to Avoid Failure? 1 Build with Organizational Buy-in 2 Build with End In Mind 3 Build with Structured Approach
  10. 10. How to Avoid Failure? 1 Build with Organizational Buy-in 2 Build with End In Mind 3 Build with Structured Approach
  11. 11. Data Data Science Value
  12. 12. Data Data Science Value Business Question
  13. 13. Data Science
  14. 14. The Blind Men and the Elephant It was six men of Indostan To learning much inclined, Who went to see the Elephant (Though all of them were blind), That each by observation Might satisfy his mind. And so these men of Indostan Disputed loud and long, Each in his own opinion Exceeding stiff and strong, Though each was partly in the right And all were in the wrong! – John Godfrey Saxe
  15. 15. Data Science STATISTICS BUSINESS COMPUTER SCIENCE
  16. 16. Data Science Process [process wheel spanning BUSINESS, STATISTICS, and COMPUTER SCIENCE]: 1 Business Understanding, 2 Data Preparation, 3 Data Munging, 4 Model Training, 5 Model Evaluation (Visualize Results), 6 Model Deployment, 7 Model Tracking
  17. 17. Data Science Process [the same process wheel spanning BUSINESS, STATISTICS, and COMPUTER SCIENCE]: Business Understanding, Data Preparation, Data Munging, Model Training, Model Evaluation (Visualize Results), Model Deployment, Model Tracking
  18. 18. Business Understanding Data Preparation Data Munging Model Training Model Evaluation Model Deployment Model Tracking
  19. 19. Business Understanding Far better an approximate answer to the right question than an exact answer to the wrong question. – John Tukey
  20. 20. Business Understanding 1 2 3 DETERMINE UNDERSTAND MAP
  21. 21. Business Understanding 1 2 3 DETERMINE UNDERSTAND MAP What does the client want to achieve? Primary Objective o Reduce attrition o Customized targeting o Plan future media spend o Prevent fraud o Recommend Products
  22. 22. Business Understanding 1 2 3 DETERMINE UNDERSTAND MAP o Identify secondary or competing objectives o List assumptions, constraints, and important factors o Understand success criteria o Specific, measurable, time-bound o Study existing solutions (if any)
  23. 23. Business Understanding 1 2 3 DETERMINE UNDERSTAND MAP o State the project objective(s) in technical terms o Describe how the data science project will help solve the business problem o Explore successful scenarios Business Objective → Technical Objective
  24. 24. Business Understanding 1 2 3 DETERMINE UNDERSTAND MAP
      OBJECTIVE                      TECHNIQUE            EXAMPLES
      Predict Values                 Regression           Linear regression, Bayesian regression, Decision Trees
      Predict Categories             Classification       Logistic regression, SVM, Decision Trees
      Predict Preference             Recommender System   Collaborative / Content-based filtering
      Discover groups                Clustering           k-means, Hierarchical clustering
      Discover unusual data points   Anomaly Detection    k-NN, One-class SVM
      …
  25. 25. If all you have is a hammer, then everything looks like a nail.
  26. 26. Business Understanding o Primary Objective: Prevent attrition → Increase subscription renewals o Competing Objective: Core customers are also targeted for up-sell o Constraints: Avoid targeting customers too close to their contract expiration o Success Criteria: Current renewal rate = 65% → Improve by 8% o Existing Solution: Business-rule-based targeting o Data Science Objective: Build a binary classification model to identify customers who are not likely to renew their subscriptions three months in advance of their contract expiration. o Success Scenario: The model correctly identifies 80% of the future attritors, the promotional campaign targets all likely attritors, and 19% of them are successfully converted into non-attritors.
  27. 27. Business Understanding o Duration o Inventory of resources o Tools and techniques o Risks and contingencies o Costs and benefits o Milestones The thought that disaster is impossible often leads to an unthinkable disaster. – Gerald Weinberg Project Plan Titanic at Southampton docks, prior to departure
  28. 28. Data Preparation 1 IDENTIFY 2 COLLECT 3 ASSESS 4 VECTORIZE
  29. 29. Data Preparation o Data sources, formats o Databases, streaming APIs, logs, Excel files, websites, etc. o Entity Relationship Diagram (ERD) o Identify additional data sources o Demographic data appends o Geographical data o Census data, etc. o Identify relevant data o Record unavailable data o How long a history is available, and how much of it should be used? 1 IDENTIFY 2 COLLECT 3 ASSESS 4 VECTORIZE
  30. 30. Data Preparation o Access or acquire all relevant data in a central location o Quality control checks and tests o File formats, delimiters o Number of records, columns o Primary keys 1 IDENTIFY 2 COLLECT 3 ASSESS 4 VECTORIZE
  31. 31. Data Preparation First look at the data o Get familiar with the data o Study seasonality o Monthly/weekly/daily patterns o Unexplained gaps or spikes in the historical data o Detect mistakes o Extreme or outlier values o Unusual values o Special missing values o Check assumptions o Review distributions 1 IDENTIFY 2 COLLECT 3 ASSESS 4 VECTORIZE
  32. 32. Tidy datasets are all alike; every messy dataset is messy in its own way.
  33. 33. Data Preparation 1 IDENTIFY 2 COLLECT 3 ASSESS 4 VECTORIZE GOAL: Create Analysis Dataset. $y = (y_1, y_2, y_3, \dots, y_n)^\top$ is the outcome / target / dependent variable; $X = \begin{pmatrix} x_{11} & x_{12} & x_{13} & \cdots & x_{1j} \\ x_{21} & x_{22} & x_{23} & \cdots & x_{2j} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ x_{n1} & x_{n2} & x_{n3} & \cdots & x_{nj} \end{pmatrix}$ holds the independent variables / inputs / features.
  34. 34. Target Definition o Churn = 90 days of consecutive inactivity o What’s inactivity? o Incoming and outgoing calls o Data usage o Incoming text o Promotional texts o Voicemail usage o Call forwarding o Etc. o What to do with customers who change their device or phone number? o Churn at the individual (person) level, or at the device (phone) level? o How many customers return (become active again) after 90 days of inactivity? o Prediction window o Predict 90 days of consecutive inactivity? o Would 10 days of consecutive inactivity suffice? o How many customers return after x days of inactivity? o How to deal with fraud? o Etc.
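
A rough illustration of pinning such a target down: a minimal pandas sketch that flags churn as 90+ days of consecutive inactivity as of a snapshot date. The schema, dates, and snapshot are invented for the example.

```python
# Sketch: churn flag = no activity of any kind in the 90 days before a
# chosen snapshot date. Column names and dates are illustrative
# assumptions, not the deck's actual schema.
import pandas as pd

activity = pd.DataFrame({
    "customer_id": [1001, 1001, 1002, 1003],
    "activity_date": pd.to_datetime(
        ["2016-10-01", "2016-12-20", "2016-08-15", "2017-01-05"]),
})

snapshot = pd.Timestamp("2017-01-31")
last_seen = activity.groupby("customer_id")["activity_date"].max()

# 90+ days since the last observed activity => churned.
churn = ((snapshot - last_seen).dt.days >= 90).astype(int).rename("churn")
print(churn)
```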
  35. 35. Modeling Sample o Historical trends and seasonality o Are there certain timeframes that should be discarded? o The model should be generalizable for future targeting purposes o Eligible, relevant population o Must align with the business goals o Eligible, relevant markets (e.g., US vs. International) o Must align with the business goals o Outdated products
  36. 36. Selection Bias Abraham Wald's Work on Aircraft Survivability Journal of the American Statistical Association Vol. 79, No. 386 (Jun., 1984)
  37. 37. Information Leakage [timeline: … Sep Oct Nov Dec Jan | Feb Mar Apr …; OBSERVATION WINDOW through the end of Jan, PREDICTION WINDOW thereafter] Use all available information ("leading indicators") as of the end of Jan to predict whether customers will [do something] in the prediction window. o The leading indicators must be calculated from the timeframe leading up to the event; they must not overlap with the prediction window. o Beware of proxy events, e.g., future bookings
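
A small pandas sketch of keeping the two windows separate, so features never see the prediction window; the dates, column names, and window boundaries are illustrative assumptions.

```python
# Sketch: features ("leading indicators") come only from the observation
# window; the target comes only from the prediction window.
import pandas as pd

events = pd.DataFrame({
    "customer_id": [1, 1, 2, 2],
    "event_date": pd.to_datetime(
        ["2016-11-05", "2017-02-10", "2017-01-20", "2017-03-01"]),
})

obs_end = pd.Timestamp("2017-01-31")   # end of observation window
pred_end = pd.Timestamp("2017-04-30")  # end of prediction window

# Features: events up to the end of the observation window only.
features = (events[events["event_date"] <= obs_end]
            .groupby("customer_id").size().rename("n_events"))

# Target events: strictly inside the prediction window.
in_pred = (events["event_date"] > obs_end) & (events["event_date"] <= pred_end)
target_ids = events.loc[in_pred, "customer_id"].unique()
print(features, target_ids)
```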
  38. 38. Data Aggregation o Attribute creation o Derived attributes: Household income / Number of adults = Income per adult o Brainstorm with team members $X = [x_{ij}]_{n \times j}$
  39. 39. Data Aggregation
      CUSTOMER_ID   PURCHASE_DATE
      1001          02-12-2015:05:20:39
      1001          05-13-2015:12:18:09
      1001          12-20-2016:00:15:59
      1002          01-19-2014:04:28:54
      1003          01-12-2015:09:20:36
      1003          05-31-2015:10:10:02
      …             …
      Roll up to one row per customer (CUSTOMER_ID, x1, x2, …, xj), e.g.:
      1. Number of transactions (Frequency)
      2. Days since the last transaction (Recency)
      3. Days since the earliest transaction (Tenure)
      4. Avg. days between transactions
      5. # of transactions during weekends
      6. % of transactions during weekends
      7. # of transactions by day-part (breakfast, lunch, etc.)
      8. % of transactions by day-part
      9. Days since last transaction / Avg. days between transactions
      10. …
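
One possible pandas version of this roll-up, computing the first few features on the list; the as-of date is an assumption.

```python
# Sketch: aggregate a transaction log to one row per customer with
# recency/frequency/tenure-style features. The as-of date is made up.
import pandas as pd

tx = pd.DataFrame({
    "CUSTOMER_ID": [1001, 1001, 1001, 1002, 1003, 1003],
    "PURCHASE_DATE": pd.to_datetime([
        "2015-02-12", "2015-05-13", "2016-12-20",
        "2014-01-19", "2015-01-12", "2015-05-31"]),
})
as_of = pd.Timestamp("2017-01-01")

g = tx.groupby("CUSTOMER_ID")["PURCHASE_DATE"]
features = pd.DataFrame({
    "frequency": g.count(),                 # 1. number of transactions
    "recency": (as_of - g.max()).dt.days,   # 2. days since last transaction
    "tenure": (as_of - g.min()).dt.days,    # 3. days since earliest transaction
})
# 4. avg. days between transactions (NaN for single-purchase customers)
features["avg_gap"] = ((features["tenure"] - features["recency"])
                       / (features["frequency"] - 1))
print(features)
```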
  40. 40. Data Preparation 1 IDENTIFY 2 COLLECT 3 ASSESS 4 VECTORIZE OUTPUT: Analysis Dataset $y = (y_1, \dots, y_n)^\top$, $X = [x_{ij}]_{n \times j}$
  41. 41. Entity Relationship Diagram (ERD)
  42. 42. Data Science Process [wheel: Business Understanding, Data Preparation, Data Munging, Model Training, Model Evaluation, Model Deployment, Model Tracking] Time spent: 80% Data Munging, 20% Model Building
  43. 43. Give me six hours to chop down a tree and I will spend the first four sharpening the axe. – Abraham Lincoln
  44. 44. Data Munging o Descriptive statistics o Review with business owners o Correlation analysis o Review with business owners o Watch out for data leakage o Impute missing values o Trim extreme values o Process categorical attributes o Transformations (square, log, etc.) o Binning / variable smoothing o Multicollinearity o Reduce redundancy o Create additional features o Interactions o Normalization (scaling)
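
A minimal scikit-learn sketch of a few of these steps (imputation, scaling, and categorical encoding) wired into one preprocessing pipeline; the column names are placeholders.

```python
# Sketch: imputation + scaling for numeric columns, imputation +
# one-hot encoding for categoricals. Column names are hypothetical.
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

numeric = ["income", "tenure_days"]      # placeholder columns
categorical = ["plan_type", "region"]

preprocess = ColumnTransformer([
    ("num", Pipeline([
        ("impute", SimpleImputer(strategy="median")),   # impute missing values
        ("scale", StandardScaler()),                    # normalization (scaling)
    ]), numeric),
    ("cat", Pipeline([
        ("impute", SimpleImputer(strategy="most_frequent")),
        ("encode", OneHotEncoder(handle_unknown="ignore")),  # categoricals
    ]), categorical),
])
# preprocess.fit_transform(df) would yield the model-ready matrix.
```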
  45. 45. Data Munging [exploratory analysis grid: univariate/multivariate × non-graphical/graphical]
      Univariate, non-graphical: Categorical: tabulated frequencies; Quantitative: central tendency (mean, median, mode), spread (standard deviation, inter-quartile range), skewness and kurtosis
      Univariate, graphical: Histograms; box plots, stem-and-leaf plots; quantile-normal plots
      Multivariate, non-graphical: Cross-tabulation; univariate statistics by category; correlation matrices
      Multivariate, graphical: Univariate graphs by category (e.g., side-by-side box plots); scatterplots; correlation matrix plots
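
A quick pandas pass over the non-graphical cells of this grid, using synthetic data as a stand-in for the analysis dataset.

```python
# Sketch: the non-graphical EDA checks with pandas on synthetic data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
df = pd.DataFrame({"x1": rng.normal(size=100),
                   "x2": rng.normal(size=100),
                   "grp": rng.choice(list("AB"), 100)})

print(df.describe())                       # central tendency and spread
print(df["grp"].value_counts())            # tabulated frequencies
print(df["x1"].skew(), df["x1"].kurt())    # skewness and kurtosis
print(df[["x1", "x2"]].corr())             # correlation matrix
print(df.groupby("grp")["x1"].agg(["mean", "std"]))  # stats by category
```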
  46. 46. Data Munging • Feature Reduction: The process of selecting a subset of features for use in model construction • Useful for both supervised and unsupervised learning problems Art is the elimination of the unnecessary. – Pablo Picasso
  47. 47. Data Munging Feature Reduction: Why • True dimensionality <<< Observed dimensionality • The abundance of redundant and irrelevant features • Curse of dimensionality • With a fixed number of training samples, predictive power decreases as the dimensionality increases [Hughes phenomenon] • With $d$ binary variables, the number of possible combinations is $O(2^d)$ • Value of analytics: Descriptive → Diagnostic → Predictive → Prescriptive (hindsight → insight → foresight) • Law of Parsimony [Occam's Razor]: Other things being equal, simpler explanations are generally better than complex ones • Overfitting • Execution time (algorithm and data)
  48. 48. Data Munging Feature Reduction Techniques 1. Percent missing values 2. Amount of variation 3. Pairwise correlation 4. Multicollinearity 5. Principal Component Analysis (PCA) 6. Cluster analysis 7. Correlation (with the target) 8. Forward selection 9. Backward elimination 10. Stepwise selection 11. LASSO 12. Tree-based selection
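
A sketch of three of these techniques (2: amount of variation, 3: pairwise correlation, 11: LASSO) on a small synthetic dataset; the thresholds are illustrative.

```python
# Sketch: three feature-reduction checks on synthetic data.
import numpy as np
import pandas as pd
from sklearn.feature_selection import SelectFromModel, VarianceThreshold
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(200, 5)),
                 columns=[f"x{i}" for i in range(5)])
X["x0_dup"] = X["x0"] + rng.normal(0, 0.01, size=200)  # near-duplicate feature
y = X["x0"] + 2 * X["x1"] + rng.normal(0, 0.1, size=200)

# 2. Amount of variation: drop near-constant features.
keep_var = X.columns[VarianceThreshold(threshold=0.01).fit(X).get_support()]

# 3. Pairwise correlation: flag highly correlated pairs for pruning.
corr = X.corr().abs()
pairs = [(a, b) for a in corr.columns for b in corr.columns
         if a < b and corr.loc[a, b] > 0.95]

# 11. LASSO: keep features with non-zero coefficients.
keep_lasso = X.columns[SelectFromModel(LassoCV(cv=5)).fit(X, y).get_support()]
print(list(keep_var), pairs, list(keep_lasso))
```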
  49. 49. Feature Reduction: from "all you can eat" to "eat healthy"
  50. 50. Model Training • Try more than one machine learning technique • Fine-tune parameters • Assess model performance • Avoid over-fitting
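
One way to act on these bullets is a cross-validated grid search over more than one technique; the data and parameter grids below are illustrative.

```python
# Sketch: compare two techniques, tuning each with cross-validation.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=500, random_state=0)

candidates = [
    (LogisticRegression(max_iter=1000), {"C": [0.1, 1.0, 10.0]}),
    (RandomForestClassifier(random_state=0), {"max_depth": [3, 5, None]}),
]
for est, grid in candidates:
    search = GridSearchCV(est, grid, cv=5, scoring="roc_auc").fit(X, y)
    print(type(est).__name__, search.best_params_,
          round(search.best_score_, 3))
```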
  51. 51. Assess Model Performance • Area Under the ROC Curve (AUC), Confusion Matrix, Precision, Recall, Log-loss • Model Lift, Model Gains, Kolmogorov-Smirnov (KS), etc.
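
For reference, the scikit-learn calls behind the first group of metrics, run on made-up labels and scores.

```python
# Sketch: classification metrics on toy labels and scores.
from sklearn.metrics import (confusion_matrix, log_loss, precision_score,
                             recall_score, roc_auc_score)

y_true = [0, 0, 1, 1, 1, 0, 1, 0]
y_prob = [0.1, 0.4, 0.8, 0.7, 0.3, 0.2, 0.9, 0.6]
y_pred = [int(p >= 0.5) for p in y_prob]   # 0.5 cutoff is illustrative

print("AUC:      ", roc_auc_score(y_true, y_prob))
print("Precision:", precision_score(y_true, y_pred))
print("Recall:   ", recall_score(y_true, y_pred))
print("Log-loss: ", log_loss(y_true, y_prob))
print("Confusion:\n", confusion_matrix(y_true, y_pred))
```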
  52. 52. When a measure becomes a target, it ceases to be a good measure. – Goodhart's law
  53. 53. Model Evaluation Model Selection and Performance Assessment • Hold-out (test) set • Tri-fold partitioning, k-fold cross-validation • Out-of-time test set • Value of analytics • Law of Parsimony (Occam's Razor) • Model execution time • Deployment complexity • Compare against existing business rules/model • Model peer-review (Quality Control)
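
A minimal sketch of the hold-out-plus-cross-validation scheme above; the estimator and data are stand-ins.

```python
# Sketch: k-fold CV on the training portion, final check on a hold-out.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_score, train_test_split

X, y = make_classification(n_samples=500, random_state=0)

# Hold-out (test) set, untouched until the final assessment.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

model = LogisticRegression(max_iter=1000)
cv_auc = cross_val_score(model, X_train, y_train, cv=5, scoring="roc_auc")

model.fit(X_train, y_train)
test_auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"CV AUC {cv_auc.mean():.3f}, hold-out AUC {test_auc:.3f}")
```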
  54. 54. Bias-Variance Tradeoff
  55. 55. Model Evaluation Visualization of Modeling Results • AUC • Cumulative Gains chart • Model usage recommendations • Decile reports • How many deciles should be targeted? • Predictor Importance • Each predictor's relationship with the target • Interpret results as they relate to the business application
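
A sketch of a decile (gains) report on synthetic scores: rank by model score, cut into ten equal bins, and compare each bin's response rate to the baseline.

```python
# Sketch: decile report with lift, on made-up scores and responses.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
score = rng.uniform(size=1000)             # model scores
actual = rng.binomial(1, 0.4 * score)      # response loosely tied to score

df = pd.DataFrame({"score": score, "actual": actual})
df["decile"] = 10 - pd.qcut(df["score"], 10, labels=False)  # decile 1 = best

report = df.groupby("decile")["actual"].agg(["count", "mean"])
report["lift"] = report["mean"] / df["actual"].mean()       # lift vs. baseline
print(report)
```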
  56. 56. Visual Output [sample charts]
  57. 57. Model Deployment • Model production cycle (daily, weekly, monthly, etc.) • Scoring code, or publish the model as a web service • Model Documentation (Technical Specifications) • Data preparation, transformations, imputations, parameter settings, etc. • Reproducibility • Docker containers, etc. • Model Persistence vs. Model Transience
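
A sketch of the model-persistence side of deployment using joblib; the file path and model are placeholders.

```python
# Sketch: persist a trained model, then reload it in the scoring job.
import joblib
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X, y)

joblib.dump(model, "churn_model.joblib")   # done once, at build time

# In the recurring (daily/weekly/monthly) scoring job:
scorer = joblib.load("churn_model.joblib")
scores = scorer.predict_proba(X)[:, 1]     # X here stands in for fresh data
```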
  58. 58. Model Persistence vs. Model Transience
      Model Persistence [timeline Aug–Apr: Model Build → Scoring Algorithm + Model Decay Tracking → Model Rebuild → Scoring Algorithm + Model Decay Tracking] • Traditional approach • Provides stability • Less resource intensive
      Model Transience [timeline Aug–Dec: Model Build repeated each period] • Modern approach • Able to capture recent trends • Resource intensive
  59. 59. Model Tracking • Model decay tracking (monitoring) plan • Model maintenance plan • Adding new data sources • Version control • Campaign Set-up and Execution • Experimental Design • Reason Coding
  60. 60. Experimental Design
                                   Marketing Treatment   No Treatment
      Selection Based on Model     A: Test               B: Selection Hold-out
      No Selection (Random)        C: Control            D: Random Hold-out
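
A sketch of assigning customers to the four cells above; the score cutoff and hold-out rates are invented.

```python
# Sketch: model-based selection crossed with a treatment hold-out.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
df = pd.DataFrame({"score": rng.uniform(size=1000)})

selected = df["score"] >= 0.7                # selection based on model
treat_prob = np.where(selected, 0.9, 0.1)    # hold out part of each group
treated = rng.uniform(size=1000) < treat_prob

df["cell"] = np.select(
    [selected & treated, selected & ~treated,
     ~selected & treated, ~selected & ~treated],
    ["A: Test", "B: Selection Hold-out", "C: Control", "D: Random Hold-out"],
    default="")
print(df["cell"].value_counts())
```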
  61. 61. Analytics is a river. [cycle: Measure → Improve → Test]
  62. 62. Data Science Process: Recap [process wheel spanning BUSINESS, STATISTICS, and COMPUTER SCIENCE]: Business Understanding, Data Preparation, Data Munging, Model Building, Model Evaluation (Visualize Results), Model Deployment, Model Tracking
  63. 63. AutoClassifier [covers Data Munging, Model Building, Model Evaluation, Visualize Results, Model Deployment] These steps usually take 2 to 3 weeks, but with AutoClassifier they take a few mouse-clicks and less than 24 hours. www.derive.io
  64. 64. Process as Proxy Good process serves you so you can serve customers. But if you’re not watchful, the process can become the proxy for the result you want. You stop looking at outcomes and just make sure you’re doing the process right. Gulp. It’s always worth asking, do we own the process or does the process own us? – Jeff Bezos
  65. 65. THANK YOU! vishal@derive.io www.linkedIn.com/in/VishalJP www.derive.io
