The team at TGFI recently launched a new website for the Big South Conference. Through our initial research we identified three key requirements:
- They had an incredible amount of content that needed to be well organized.
- It had to look good on a variety of devices.
- It had to be fast. Really fast.
This project, however, was a bit different. We knew from early research that over half of the people using the site were on mobile devices, one of the highest percentages we've ever seen. Traffic also spikes sharply when people rush to the site to watch football or basketball games, and those spikes compound when multiple games are streaming simultaneously.
How We Addressed It
There are many ways to approach performance at this scale, but here are some of the key techniques we used.
- We served video, images and files (PDF, Word, Excel, etc.) from fast content delivery networks (CDNs). These systems are optimized for high-traffic, high-bandwidth scenarios.
- We cached key pages, data and fragments (articles, menus, etc.) in Memcached, so a fragment only needs to be regenerated when its underlying content changes.
- We hosted the site ourselves so that we control as much of each request as we can.
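The cache-until-it-changes pattern behind the Memcached point can be sketched in a few lines of Ruby. In the real app this would be `Rails.cache` backed by Memcached (typically via the dalli gem); here a plain Hash stands in for the Memcached store so the example is self-contained, and all names are illustrative rather than taken from the site:

```ruby
# A minimal sketch of fragment caching: compute on a miss, serve from the
# cache on a hit, and invalidate a key only when its content changes.
# The Hash is a stand-in for Memcached so this runs anywhere.
class FragmentCache
  def initialize
    @store = {}
  end

  # Fetch a cached value, computing and storing it only on a miss.
  def fetch(key)
    return @store[key] if @store.key?(key)
    @store[key] = yield
  end

  # When content changes, delete its key so the next fetch regenerates it.
  def invalidate(key)
    @store.delete(key)
  end
end

cache = FragmentCache.new
renders = 0
render_menu = -> { renders += 1; "<ul><li>Football</li><li>Basketball</li></ul>" }

cache.fetch("menu/v1", &render_menu) # miss: renders the fragment
cache.fetch("menu/v1", &render_menu) # hit: served from cache, no re-render
cache.invalidate("menu/v1")          # menu changed, drop the stale copy
cache.fetch("menu/v1", &render_menu) # miss again: re-rendered once

puts renders # => 2
```

The payoff is that expensive work (rendering an article or menu) happens once per change rather than once per request, which is what keeps response times flat during traffic spikes.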
OK, But How Well Does This Really Work?
Extremely well. Here is our performance graph from the last week, showing an average response time of just 32 milliseconds. If I had to guess, the average across most of our applications is somewhere between 750 ms and 1 second. That's a big difference.
I should be honest here, though: that graph is misleading. It only shows the requests that make it all the way to the Ruby on Rails application. Once you take Varnish caching and our web servers into account, our average response time is under ten milliseconds. Even under heavier spikes of traffic, there has been no perceptible change.
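Varnish can only serve a page from its own cache if the application marks the response as cacheable. In Rails that is usually done with `expires_in 30.seconds, public: true`; the bare Rack-style sketch below shows the underlying header itself. The app name, page and TTL are illustrative assumptions, not the site's real configuration:

```ruby
# Sketch of how an app tells an HTTP cache like Varnish that a page may be
# cached. The "public, max-age=30" Cache-Control value lets the cache serve
# this copy for 30 seconds, so repeated game-day requests for the same page
# never reach the Rails process at all.
def scoreboard_app(env)
  body = "<h1>Live Scores</h1>"
  headers = {
    "Content-Type"  => "text/html",
    "Cache-Control" => "public, max-age=30",
  }
  [200, headers, [body]]
end

status, headers, body = scoreboard_app({})
puts headers["Cache-Control"] # => public, max-age=30
```

Even a short TTL like this absorbs a traffic spike: no matter how many people hit the page during a game, the backend renders it at most once every 30 seconds.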
To all the naysayers out there: you are wrong. Rails can scale. You just have to know what you're doing.