
UXPA 2016: Mixed Methods Research in the Age of Big Data

779 views


UX professionals have a long history of blending quantitative and qualitative research to better understand the customer experience. As Data Science has emerged as a discipline (with an increasing amount of hype), it's all too easy to engage only during results time, sharing information but working independently. At UXPA 2016, I made the case for deeper collaboration between UX professionals and Data Scientists during research and analysis time, for the sake of better Design outcomes for all.

Published in: Data & Analytics

UXPA 2016: Mixed Methods Research in the Age of Big Data

  1. Mixed Methods Research in the Age of Big Data: A Primer for UX Professionals
  2. Zachary Sam Zaiss, UX Data Scientist | Microsoft Cloud, @zszaiss. Timeline: 2006, UX Researcher; 2012, UX DS; 2016, Berkeley MIDS
  3. Gartner Hype Cycle for Emerging Technologies: 2014. http://www.gartner.com/newsroom/id/2819918
  4. Gartner Hype Cycle for Emerging Technologies: 2015. http://www.gartner.com/newsroom/id/3114217
  5. The Education Perspective: 84 Graduate Degree Programs in Data Science; 78 Graduate Degree Programs in UX. https://whatsthebigdata.com/2012/08/09/graduate-programs-in-big-data-and-data-science/ http://uxmastery.com/resources/ux-degrees/
  6. http://radar.oreilly.com/2013/10/design-thinking-and-data-science.html
  7. We need to make collaboration with Data Scientists a priority… and it starts with a conversation.
  8. Tip #1: Stake Your Claim
  9. Qualitative Evaluation Criteria Talking Points: Quantitative basis for n values. https://www.nngroup.com/articles/why-you-only-need-to-test-with-5-users/ http://www.measuringu.com/blog/five-history.php http://www.measuringu.com/blog/five-for-five.php
  10. Qualitative Evaluation Criteria Talking Points: Quantitative basis for n values; Existence Proof. https://www.youtube.com/watch?v=3uqZPnxG4_w
  11. Qualitative Evaluation Criteria Talking Points: Quantitative basis for n values; Existence Proof; Grounded Theory; Inductive vs. Deductive Reasoning (Theory from Data vs. Data from Theory). http://www.slideshare.net/traincroft/hcic-muller-guha-davis-geyer-shami-2015-0629
  12. Qualitative Evaluation Criteria Talking Points: Quantitative basis for n values; Existence Proof; Grounded Theory; Inductive vs. Deductive Reasoning; Constructivism vs. Determinism. https://us.sagepub.com/en-us/nam/research-design/book237357
  13. Discussing evaluation criteria for qualitative research needs to be second nature.
  14. What is your mic drop moment?
  15. Tip #1: Stake Your Claim
  16. Tip #2: Speak the Language
  17. vs
  18. Models + Key Aspects of Analysis. Descriptive Model / Descriptive Statistics. Statistical Significance: What is the probability of obtaining this result given the null hypothesis is true? Practical Significance: Is the effect on the outcome large enough to be considered relevant?
  19. The statement process was lengthier and more controversial than anticipated. http://fivethirtyeight.com/features/statisticians-found-one-thing-they-can-agree-on-its-time-to-stop-misusing-p-values/
  20. 6 Principles for p-values from the ASA's Statement: (1) P-values can indicate how incompatible the data are with a specified statistical model. (2) P-values do not measure the probability that the studied hypothesis is true, or the probability that the data were produced by random chance alone. (3) Scientific conclusions and business or policy decisions should not be based only on whether a p-value crosses a specific threshold. (4) Proper inference requires full reporting and transparency. (5) A p-value, or statistical significance, does not measure the size of an effect or the importance of a result. (6) By itself, a p-value does not provide a good measure of evidence regarding a model or hypothesis. http://amstat.tandfonline.com/doi/abs/10.1080/00031305.2016.1154108
  21. Models + Key Aspects of Analysis. Descriptive Model / Descriptive Statistics. Statistical Significance: What is the probability of obtaining this result given the null hypothesis is true? Practical Significance: Is the effect on the outcome large enough to be considered relevant? Predictive Model / Supervised Machine Learning. Accuracy: How well does the model predict the outcome for new data cases?
  22. https://www.captionbot.ai/
  23. Models + Key Aspects of Analysis. Descriptive Model / Descriptive Statistics. Statistical Significance: What is the probability of obtaining this result given the null hypothesis is true? Practical Significance: Is the effect on the outcome large enough to be considered relevant? Predictive Model / Supervised Machine Learning. Accuracy: How well does the model predict the outcome for new data cases? Representation Model / Unsupervised Machine Learning. Optimization Criteria: How will we determine that we've built a reasonable and appropriate representation model for our data?
  24. vs
  25. None of these measures get at the contextual meaning behind the model.
  26. A Diagram for Product Managers… Source: Martin Eriksson, Mind the Product. http://www.mindtheproduct.com/2011/10/what-exactly-is-a-product-manager/
  27. …And a Framework for Attributes (UX / Business / Tech). Experience Attributes: customer attributes that can explain how that customer will experience a product. Technology Attributes: customer attributes that can explain whether customers will have technical issues with a product. Business Attributes: customer attributes that can explain the extent to which the customer will contribute to business outcomes.
  28. Example: Developer Tools (X / B / T): Prog Language, Target Platform, Project Complexity, Project Audience, Type of App, Educational Background, Keyboard Proclivity, Project Complexity
  29. Example: Freemium Games (X / B / T): Platform Used, Facebook Connected, Whale Status, Completionist Tendencies, Game Session Time
  30. Example: Fitness Bands (X / B / T): Connected Devices Type / Version, Frequency of Exercise, Friends with Same Band, Finger Shape (Fat Fingers), Farsightedness, Skin Irritation
  31. We are uniquely qualified to articulate the experience attributes of our products.
  32. Tip #2: Speak the Language
  33. Tip #3: Get Involved
  34. A Metaphor for A/B Experiments
  35. A Better Metaphor for A/B Experiments
  36. How can we provide greater context to A/B test findings?
  37. Heterogeneous Treatment Effects: an aggregate KPI that barely moves between control and treatment (0.71 vs. 0.72) can mask opposing conversion effects in subgroups such as product experts and product novices.
  38. https://datadialogs.ischool.berkeley.edu/2014/schedule/experiments-action
  39. Tip #1: Stake Your Claim. Tip #2: Speak the Language. Tip #3: Get Involved.
  40. Mixed Methods Research in the Age of Big Data: A Primer for UX Professionals. http://www.uxpa.org/sessionsurvey?sessionid=113
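The distinction slide 18 draws between statistical and practical significance (and the trap the ASA principles on slide 20 warn about) can be made concrete with a short sketch. The numbers below are illustrative, not from the talk: a one-point lift measured over enough users is highly "significant" by p-value while the effect size stays negligible.

```python
import math

def two_proportion_test(x1, n1, x2, n2):
    """Two-sided z-test for the difference between two proportions,
    plus Cohen's h as an effect-size measure."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    # Cohen's h for proportions; |h| < 0.2 is conventionally "small"
    h = 2 * (math.asin(math.sqrt(p2)) - math.asin(math.sqrt(p1)))
    return z, p_value, h

# Illustrative KPI of 0.71 (control) vs. 0.72 (treatment), 100k users per arm
z, p, h = two_proportion_test(71_000, 100_000, 72_000, 100_000)
print(f"z = {z:.2f}, p = {p:.1e}, Cohen's h = {h:.3f}")
```

With 100,000 users per arm, the 0.71 vs. 0.72 split gives z near 5 and a p-value far below 0.05, yet Cohen's h is only about 0.02, well under the 0.2 "small effect" threshold: statistically significant, practically negligible.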
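The heterogeneous treatment effect on slide 37 can be sketched the same way. The subgroup figures here are hypothetical, chosen only so the aggregate matches the slide's 0.71 vs. 0.72 example:

```python
# Hypothetical segment-level conversion rates per arm; equal segment sizes
# make the aggregate land exactly on 0.71 (control) vs. 0.72 (treatment).
segments = {
    "product experts": {"n": 50_000, "control": 0.80, "treatment": 0.64},
    "product novices": {"n": 50_000, "control": 0.62, "treatment": 0.80},
}

def aggregate_rate(arm):
    """Pooled conversion rate for one arm across all segments."""
    total_n = sum(s["n"] for s in segments.values())
    converted = sum(s["n"] * s[arm] for s in segments.values())
    return converted / total_n

print(f"aggregate: control={aggregate_rate('control'):.2f}, "
      f"treatment={aggregate_rate('treatment'):.2f}")
for name, s in segments.items():
    print(f"{name}: lift={s['treatment'] - s['control']:+.2f}")
```

The aggregate barely moves (+0.01), while experts drop 16 points and novices gain 18: precisely the kind of contextual finding that segment attributes from UX research can surface in an A/B analysis.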
