DESCRIPTION
Rails fragment caching is a simple tool against performance issues. With the memcached cache store you get a lean, scalable way to cache parts of a request. Memcached supports time-based expiry, which reduces the plumbing code needed for cache sweeping to zero while providing a decent generic solution for many cases. Yet when volume grows and fragments take time to generate, it tends to create load peaks by synchronizing cache recalculation across multiple processes, defeating the purpose of the cache and killing the performance of the whole stack. This talk describes the problem and proposes an alternative cache store that preserves the simplicity of Rails' time-based fragment caching while offering atomic cache invalidation and refreshing. The presentation was given at the Paris.rb Ruby user group in August 2011. It should be considered experimental and open to discussion and improvement.
Copyright Dimelo SA www.dimelo.com
Rails performance: Controlled expiration on Fragment Caching
Renaud Morvan — [email protected]
Load skyrockets when traffic increases
A known solution: fragment caching
Easy to implement:
<% cache("mycachekey") do %>
  All the topics in the system:
  <%= render :partial => "topic", :collection => Topic.find(:all) %>
<% end %>
But you have to sweep it!
Sweeping is hard
« There are only two hard things in Computer Science: cache invalidation and naming things. » (Phil Karlton)
Sweeping is hard
• Memcached is the preferred way to do fragment caching
• Memcached implements key expiration:
<% cache("mycachekey", :expires_in => 3.minutes) do %>
<% end %>
• A great solution for the medium term: no sweeper to manage
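Since the slides only show the view-helper side, the expiry mechanics can be sketched in plain Ruby. This is a toy in-memory store standing in for memcached; `ToyCache` and its method names are ours, not Rails' or memcached's API:

```ruby
# Toy cache with per-entry expiry, mimicking memcached-style :expires_in.
# Plain-Ruby stand-in for illustration only.
class ToyCache
  Entry = Struct.new(:value, :expires_at)

  def initialize
    @store = {}
  end

  def write(key, value, expires_in:)
    @store[key] = Entry.new(value, Time.now + expires_in)
  end

  # Returns the value while the entry is fresh, nil once the deadline passes.
  def read(key, now: Time.now)
    entry = @store[key]
    entry && now < entry.expires_at ? entry.value : nil
  end
end

cache = ToyCache.new
cache.write("mycachekey", "<ul>topics</ul>", expires_in: 180)
cache.read("mycachekey")                      # fresh: returns the fragment
cache.read("mycachekey", now: Time.now + 200) # past 180s: returns nil
```

With real memcached the eviction happens server-side; the point is the same: after `expires_in` seconds the fragment silently disappears and the next request regenerates it.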
But... here is the load profile when traffic keeps increasing
Fragment cache is no silver bullet
• It synchronizes cache recalculation across processes
• The slower the calculation, the worse the load spike
• Multiple processes try to recalculate the same key:

def cache(key, options = {}, &block)
  unless Rails.cache.read(key, options)
    yield.tap do |result|
      Rails.cache.write(key, result, options)
    end
  end
end
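The race behind this pseudo-code can be reproduced deterministically in plain Ruby (a bare hash standing in for the cache store; all names here are illustrative): both workers read before either writes, so each miss pays for its own full recalculation.

```ruby
# Toy reproduction of the dog-pile effect after an expiry:
# the cache is empty, and two "processes" check it before either writes.
cache = {}
recalculations = 0

expensive = lambda do
  recalculations += 1          # stands in for a slow partial render
  "rendered fragment"
end

# Both processes read at (nearly) the same moment: both see a miss.
hits = [cache["mycachekey"], cache["mycachekey"]]

hits.each do |hit|
  next if hit
  cache["mycachekey"] = expensive.call  # each miss triggers a recalculation
end

recalculations  # => 2: the work was done once per process, not once
```

The longer `expensive` takes, the wider the window in which other processes also see a miss, which is exactly the load-peak pattern on the previous slide.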
Solution: backgrounded sweeping
• hard
• messy
• buggy
• cron/observer debugging is painful
• code is split between multiple files
My solution: Atomic fragment caching
• Don’t use memcached’s built-in expiration
• Use software expiration instead
• When the expiration time is reached, extend it and trigger a cache recalculation
• Short expiry window => few concurrent recalculations
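The author's actual code lives in the gist cited at the end of the deck; what follows is only a rough in-memory sketch of the idea under our own names. A `Monitor` stands in for memcached's atomic operations (a multi-process version would use something like memcached's atomic `add` as the claim instead): the entry never expires in the store itself; we keep our own deadline next to the value, and on soft expiry the first caller extends the deadline (so concurrent callers keep serving the stale value) and recomputes.

```ruby
require "monitor"

# Sketch of "software expiration": AtomicCache and all names are
# illustrative, not the implementation from the gist.
class AtomicCache
  Entry = Struct.new(:value, :expires_at)

  def initialize
    @store = {}
    @lock  = Monitor.new  # stand-in for memcached's atomic add/CAS
  end

  def fetch(key, expires_in:, grace: 30, now: Time.now)
    recompute = false
    entry = nil
    @lock.synchronize do
      entry = @store[key]
      if entry.nil? || now >= entry.expires_at
        # Claim the recalculation: push the deadline slightly forward so
        # concurrent callers see a "fresh" entry and serve the old value.
        @store[key] = Entry.new(entry.value, now + grace) if entry
        recompute = true
      end
    end
    if recompute
      value = yield  # computed outside the lock, so others are not blocked
      @lock.synchronize { @store[key] = Entry.new(value, now + expires_in) }
      value
    else
      entry.value
    end
  end
end
```

On a cold cache there is no stale value to serve, so first-time misses still race; the win is on refreshes, where only one caller per `grace` window pays for the recalculation.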
Quick proposal:
• Trigger only one recalculation for all processes
• Serve old cache during recalculation period
• Synchronous OR backgrounded recalculation
• Code just in one place, no plumbing
• https://gist.github.com/1186564