Saturday, March 12, 2011

Justin.tv's Live Video Broadcasting Architecture

Platform

  1. Twice - custom web caching system. (http://code.google.com/p/twicecache/)
  2. XFS - file system.
  3. HAProxy - software load balancing.
  4. The LVS stack and ldirectord - high availability.
  5. Ruby on Rails - application server.
  6. Nginx - web server.
  7. PostgreSQL - database used for user and other meta data.
  8. MongoDB - used for their internal analytics tools.
  9. MemcacheDB - used for handling high-write data like view counters (see the counter sketch after this list).
  10. Syslog-ng - logging service.
  11. RabbitMQ - used for the job system (see the publish sketch after this list).
  12. Puppet - used to build servers.
  13. Git - used for source code control.
  14. Wowza - Flash/H.264 video server, plus lots of custom modules written in Java.
  15. Usher - custom business logic server for playing video streams (see the routing sketch after this list).
  16. S3 - small image storage.
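
The MemcacheDB choice (item 9) is easy to picture: it speaks the plain memcached wire protocol but persists to disk, so a hot counter can be bumped with an atomic server-side incr instead of a database UPDATE. A minimal sketch with the python-memcached client, assuming a MemcacheDB instance on its default port 21201; the address and key scheme are illustrative, not from the article:

    import memcache

    # MemcacheDB speaks the memcached protocol; 21201 is its default port.
    mc = memcache.Client(["127.0.0.1:21201"])

    def bump_view_count(channel_id):
        # incr is atomic on the server, so concurrent app servers never
        # lose updates the way read-modify-write against a SQL row can.
        key = "views:%s" % channel_id
        count = mc.incr(key)
        if count is None:      # counter does not exist yet
            mc.add(key, "0")   # add is a no-op if another writer beat us
            count = mc.incr(key)
        return count

    print(bump_view_count("some_channel"))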
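Similarly, the job system behind RabbitMQ (item 11) boils down to publishing persistent messages onto a queue that worker processes drain. A sketch with the pika client; the queue name and job payload are hypothetical, since the article does not describe the job format:

    import json
    import pika

    conn = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = conn.channel()
    # Durable queue plus persistent messages, so jobs survive a broker restart.
    channel.queue_declare(queue="jobs", durable=True)

    job = {"stream": "some_channel", "action": "archive_segment"}
    channel.basic_publish(
        exchange="",                 # default exchange routes by queue name
        routing_key="jobs",
        body=json.dumps(job),
        properties=pika.BasicProperties(delivery_mode=2),
    )
    conn.close()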
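Usher (item 15) is the interesting piece: when a viewer hits play, it decides which video server should carry that stream and hands back a URL for it. The real routing presumably weighs load, geography, and business rules; the toy version below just sends the viewer to the server with the most spare egress. All names and numbers here are made up for illustration:

    from dataclasses import dataclass

    @dataclass
    class VideoServer:
        host: str
        capacity_gbps: float   # each box can push about 1 Gbps
        load_gbps: float

    def pick_server(servers):
        # Least-loaded wins; a production Usher would also factor in
        # viewer geography, peering costs, and per-channel rules.
        return max(servers, key=lambda s: s.capacity_gbps - s.load_gbps)

    servers = [
        VideoServer("video3.example.tv", 1.0, 0.82),
        VideoServer("video7.example.tv", 1.0, 0.45),
    ]
    print(pick_server(servers).host)   # -> video7.example.tv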

The Stats

  1. 4 datacenters spread throughout the country.
  2. At any given time there's close to 2,000 incoming streams.
  3. 30 hours of video is added every minute.
  4. 30 million unique visitors a month.
  5. Average live bandwidth is about 45 gigabits per second (Gbps); daily peak bandwidth is about 110 Gbps; the largest spike has been 500 Gbps.
  6. Approximately 200 video servers, based on commodity hardware, each capable of sending 1 Gbps of video (see the capacity arithmetic after this list). Smaller than most CDNs yet larger than most video websites.
  7. About 100 TB of archival storage is saved per week.
  8. The complete video path can't have more than 250 ms of latency before viewers start losing the ability to converse and interact in real time.
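
The bandwidth figures in items 5 and 6 are worth checking against each other: 200 servers at 1 Gbps each gives roughly 200 Gbps of owned egress, so the 45 Gbps average runs the fleet at about a quarter of capacity, the 110 Gbps daily peak at just over half, and the 500 Gbps record spike is 2.5x what the fleet alone can push, which suggests the biggest peaks can't be served from these machines alone:

    # Pure arithmetic on the figures quoted above.
    capacity_gbps = 200 * 1.0    # 200 servers x 1 Gbps each

    avg, daily_peak, record = 45.0, 110.0, 500.0

    print("average utilization: %.0f%%" % (100 * avg / capacity_gbps))            # ~22%
    print("daily-peak utilization: %.0f%%" % (100 * daily_peak / capacity_gbps))  # 55%
    print("record spike vs capacity: %.1fx" % (record / capacity_gbps))           # 2.5x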
