It Worked for Ustream


  1. It Worked for Ustream: Product Development @ Ustream
  2. Balázs Kereskényi, product manager, API & discovery; Gergely Schmidt, product manager, premium products
  3. Making decisions by assumptions: “Let’s do this, because it worked for Spotify!” - you will never know why it worked for them.
  4. Making decisions by experiments: “Let’s try this out, measure effectiveness, and see if it works!”
  5. What to test? Icon & text: +12% conversion. Myth #13: Icons enhance usability
  6. What to test? • CTA microcopies: “Get started”, “Try out for free”, “Sign up”, “Learn more” etc. • Layout: arrangement of form fields, mixing elements on the whole page. • “Meta parameters” of a product: trial period length, price. • ...
  7. How to measure effectiveness? Conversion (basically any of your metrics): • Increase in purchases, • Engagement, • Number of clicks, • Number of signups, • Form completions, • ...
  8. Always question assumptions, but be prepared that only 1 out of 8 A/B tests will produce a significant result.* *based on the tests done by appsumo.com
  9. FAMOUS A/B TESTS
  10. Marissa Mayer: 41 shades of blue
  11. How Obama raised $60M by running a simple experiment (Optimizely, Dan Siroker)
  12. Who is to blame for the slow loading: Facebook or iOS?
  13. How to set up a test? • Targeting (when the test should start, visitor characteristics & segmentation, technical circumstances etc.), • Experiment type (A/B, multivariate, multi-page), • Traffic allocation, • Goals, • Integration with other kinds of analytics.
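The traffic-allocation step above is commonly done with deterministic hashing, so a returning visitor always lands in the same variant. This is a minimal sketch of that general technique; the slide does not describe Ustream's actual implementation (they used Optimizely), and all names here are illustrative.

```python
import hashlib

def assign_variant(user_id: str, experiment: str, allocation: dict) -> str:
    """Deterministically bucket a user into a variant.

    `allocation` maps variant name -> fraction of traffic (should sum to 1.0).
    Hashing user_id together with the experiment name keeps assignments
    stable across visits and independent across experiments.
    """
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    point = int(digest, 16) / 16**32  # uniform in [0, 1)
    cumulative = 0.0
    for variant, share in allocation.items():
        cumulative += share
        if point < cumulative:
            return variant
    return variant  # guard against floating-point rounding

# Example: 50/50 split between control and a new CTA copy
v = assign_variant("user-42", "cta-copy", {"control": 0.5, "get_started": 0.5})
```

Because the assignment is a pure function of the IDs, no per-user state needs to be stored to keep the experience consistent.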
  14. Technical background of a test • Frontend • Quick and dirty methods • Always have a usable fallback if anything goes wrong • Backend for bigger and more complicated tests • Measurement • GA and Optimizely events are our best friends
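The "usable fallback" point above can be sketched as a guard around the experimental code path: if anything in the experiment lookup or rendering fails, serve the known-good baseline. All function names below are hypothetical stand-ins, not Ustream's actual code.

```python
def render_baseline(user_id):
    return "baseline page"

def render_new_layout(user_id):
    return "new layout"

def get_variant(user_id):
    # Simulated experiment lookup; raise to mimic the A/B service failing.
    raise RuntimeError("experiment service unavailable")

def render_signup_page(user_id):
    """Try the experimental path, but always fall back to the known-good
    baseline if anything in the experiment code goes wrong."""
    try:
        if get_variant(user_id) == "new_layout":
            return render_new_layout(user_id)
    except Exception:
        pass  # a broken test must never break the page itself
    return render_baseline(user_id)

print(render_signup_page("visitor-1"))  # falls back to "baseline page"
```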
  15. Understanding your results: chance to beat baseline. …or you can simply use isvalid.org anytime.
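"Chance to beat baseline" is typically computed from the two observed conversion rates. A common approximation (not necessarily the exact formula Optimizely or isvalid.org uses) treats the difference of the two rates as normally distributed; the numbers below are illustrative only.

```python
from math import erf, sqrt

def chance_to_beat_baseline(conv_a, n_a, conv_b, n_b):
    """Approximate P(variant B's true rate > baseline A's) via a normal
    approximation to the difference of two binomial conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    se = sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    z = (p_b - p_a) / se
    return 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF at z

# Illustrative: 2.0% vs 2.4% conversion on 10k visitors each
prob = chance_to_beat_baseline(200, 10_000, 240, 10_000)
```

A value near 0.5 means the test is inconclusive; conventionally a test is only called when this probability clears a high threshold such as 95%.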
  16. USTREAM CASE STUDIES
  17. Changing directions: SaaS vs. content. Started by running tiny preliminary experiments: less content, pro CTAs. Ended up with split testing the whole page. (Slide chart: editorialized variant, pro purchases++)
  18. Ustream homepage test results • Tested on approx. 700k unique visitors. • No important general metrics harmed (pages / visit, time on site & time on homepage). • 1% increase in purchases starting from the home page. • Showing 1/3 of the content only halved the clicks.
  19. A DIFFERENT APPROACH FOR VALIDATION: GHETTO TESTING http://bit.ly/ghettozynga
  20. Ghetto test: Pay with PayPal • Built by frontend engineers, only triggered from Optimizely • Decided not to build it because of low conversion rate (~2%)
  21. Pro Purchase Flow: 1 or 2 steps? • Built by frontend engineers, only triggered from Optimizely • Currently live, analysis is under way
  22. This is not a test
  23. https://ustream.tv/platform/unlimited Thanks! @bkereskenyi @sgergely
