2. Global TV, Powered By Fans
• TV, Movies & Music videos
• Subtitles created by avid fans for free in 160+ languages
• 1bn+ video views / year
• 400mm+ words translated by fans
• 23mm+ monthly active users
• 12mm+ mobile installs
• 17,000+ hours of global prime-time content from 50+ countries
History
• Founded in Palo Alto, CA; out of beta as a company in Dec 2010
• Offices in SF, Singapore, Seoul, Tokyo
• Investors: Greylock, Andreessen Horowitz, Neoteny (Joi Ito), BBC, SK Planet …
Awards
• World Economic Forum Tech Pioneer ‘14
• WSJ Asia Most Innovative Companies ‘12
• TechCrunch Best International Start-Up ‘10
3. The Beginning
• Founded in 2008 by Razmig Hovaghimian, Changseong Ho and Jiwon Moon
• Initially named ViiKii
• Self-funded
• First engineering team in Korea
4. Viki 1.0 technology - 2008
• The Flash developers who built the subbing tools also built the website
• PHP + MySQL
• Business logic in stored procedures
• Very heavy feature set, e.g. nearly every object supported threaded conversations, and many were loaded on each page
• No caching
5. Inflection Point - 2010
• Rapid user adoption; big hits like Playful Kiss
• The website was slow and buggy, and every new feature made it worse
• Peak-hour access had to be limited to users who had made a donation to Viki
6. Viki 2.0 - 2010 to 2011
• Viki moves its base to Singapore and raises a Series A of $4.3 million in Dec 2010
• Hires Pivotal Labs to solve scale problems and train the new full-time engineers being hired
• Website rewritten in Ruby on Rails and Postgres
• Caching using Varnish and Memcached
• Uses Heroku as a PaaS
• Built iOS and Android apps
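The caching introduced in Viki 2.0 (Varnish and Memcache) follows the familiar cache-aside pattern: check the cache, fall back to the database on a miss, and populate the cache for the next reader. A minimal sketch in Python, with a plain dict standing in for the cache server; all names and data here are illustrative, not Viki's actual code:

```python
# Cache-aside: read through the cache, fall back to the database
# on a miss, then populate the cache for subsequent requests.
# A dict stands in for Memcached; keys and data are illustrative.

cache = {}                                   # stand-in for the cache server
database = {"video:1": {"title": "Playful Kiss", "views": 1_000_000}}

def fetch_video(video_id: str) -> dict:
    key = f"video:{video_id}"
    if key in cache:                         # cache hit: skip the database
        return cache[key]
    record = database[key]                   # cache miss: read from the DB
    cache[key] = record                      # warm the cache for next time
    return record
```

The hard part in practice is invalidation: a write path must delete or update the cached key, otherwise readers see stale data until the entry expires.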
7. Inflection Point - 2012
• Explosive adoption of the mobile apps
• Many partner apps and integrations
• Millions more users all over the world
• Many requests took > 150ms
• Not enough separation of concerns
• Single point of failure
8. Viki 3.0 - 2012 to now
• Public API (http://dev.viki.com/v4/api/)
• Multiple points of presence (POPs)
• High performance (most API calls < 25ms)
• Read-optimized
• Eventually consistent architecture
9. Eventually consistent
• Single central data store (source of truth)
• Writes to a specific POP are propagated to other POPs through a central queue
• Typical writes propagate within seconds
[Architecture diagram: Public API / Internal API → POPs → central Queue → Worker → DB]
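The write path described above can be sketched as a central queue fanning writes out to every POP's replica. This is a toy model of the mechanism, not Viki's code; the POP names, data, and function names are all illustrative:

```python
import queue

# Eventual consistency via a central queue: a write lands at one POP,
# is enqueued, and a worker applies it to the central DB (the source
# of truth) and to every POP replica. All names here are illustrative.

pops = {"us-east": {}, "eu": {}, "sg": {}}   # per-POP read replicas
central_db = {}                              # single source of truth
central_queue = queue.Queue()

def write(pop_name: str, key: str, value: str) -> None:
    """A client writes to its nearest POP; the write is queued."""
    pops[pop_name][key] = value              # visible locally right away
    central_queue.put((key, value))

def worker_drain() -> None:
    """The worker applies queued writes centrally and to every POP."""
    while not central_queue.empty():
        key, value = central_queue.get()
        central_db[key] = value              # update the source of truth
        for replica in pops.values():        # fan out to the other POPs
            replica[key] = value

write("sg", "subtitle:42", "annyeong")       # write lands in Singapore
worker_drain()                               # in production: within seconds
```

Between the write and the worker run, a reader in another region sees stale data; accepting that window is what makes the read path fast.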
10. Points of Presence
• Multiple POPs increase fault tolerance
• Latency-based DNS routing (Route 53), so clients access the closest healthy POP
• Currently four POPs: two in the US, one in Europe and one in Singapore
[Diagram of a POP's API proxy/caching layer: Nginx → Hyperion → Cache (Redis)]
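Latency-based routing boils down to: among the POPs that pass health checks, send the client to the one with the lowest measured latency. Route 53 does this at the DNS layer; the sketch below shows the selection logic itself, with made-up latency numbers and POP names:

```python
# Pick the closest healthy POP: filter out unhealthy endpoints,
# then choose the minimum-latency survivor. Latencies and health
# states are illustrative; Route 53 performs this at the DNS layer.

pop_latencies_ms = {"us-east": 40, "us-west": 85, "eu": 120, "sg": 210}
healthy = {"us-east": False, "us-west": True, "eu": True, "sg": True}

def closest_healthy_pop(latencies: dict, health: dict) -> str:
    candidates = {pop: ms for pop, ms in latencies.items() if health[pop]}
    return min(candidates, key=candidates.get)
```

Note that the nominally closest POP (us-east at 40ms here) is skipped when unhealthy, which is what gives multiple POPs their fault-tolerance benefit.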
11. High performance
• Network time - API requests are served by the nearest POP
• Generation time - data model tuned for performance, with extensive use of precomputed in-memory data structures; most calls return in < 25ms
• Render time - a rich API reduces client-side operations
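The "precomputed in-memory data structures" idea is that instead of querying and joining on every request, you build a lookup index once (at startup or on a timer) and serve reads straight from memory. A minimal sketch, with an illustrative dataset and index shape that are not Viki's actual schema:

```python
from collections import defaultdict

# Precompute an in-memory index so each API call is a single dict
# lookup instead of a scan. The rows and index shape are illustrative.

shows = [
    {"id": 1, "country": "KR", "title": "show-a"},
    {"id": 2, "country": "JP", "title": "show-b"},
    {"id": 3, "country": "KR", "title": "show-c"},
]

def build_index(rows):
    """Group rows by country once, up front."""
    index = defaultdict(list)
    for row in rows:
        index[row["country"]].append(row)
    return index

BY_COUNTRY = build_index(shows)              # built at startup, not per request

def shows_for_country(code: str):
    return BY_COUNTRY.get(code, [])          # O(1) lookup on the hot path
```

The trade-off matches the eventually consistent design: the index is slightly stale between rebuilds, in exchange for consistently fast generation time.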
12. Takeaways
• It is normal for your architecture/code to run its course and be replaced
• You need buy-in from management to make revolutionary rather than evolutionary changes
• No technology religion
• Be humble and keep learning
13. Where are we headed
• More social
• Personalized
• Huge growth in content (library and on-air)
• 100 million users
• Support for more devices and partners
• Exploiting synergies and leveraging other depts - ID, Superpoints, Search, etc.
14. Viki 4.0 technical challenges
• Data partitioning/sharding
• Search
• Recommendations
• Content Management
• Analytics and Insights
• Monitoring and Troubleshooting