So I was working on a project with fairly high coupling between our system and the data we got from an API. Then someone wanted a dashboard summarizing a bunch of that data, and instead of building "cache models" in some database I decided to just cache the attributes the API sent us using the standard Rails cache. Which cache store you use is up to you, but this will be faster if held in a memory-based cache (duh).
Since at least 3 models were being fed by the API, I figured: let's use a Concern! Check it out:
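A minimal sketch of what such a concern might look like (the names `ApiCacheable` and `read_local_cache` are assumptions, and a plain hash stands in for `Rails.cache` so the example runs on its own; in an app you would use `ActiveSupport::Concern` and `Rails.cache.write`/`Rails.cache.read` with an `expires_in`):

```ruby
# Hypothetical sketch of a caching concern for API-backed models.
module ApiCacheable
  STORE = {} # stand-in for Rails.cache; swap for Rails.cache.write/read in an app

  def self.included(base)
    base.extend(ClassMethods)
  end

  module ClassMethods
    def cache_key_for(id)
      "#{name.downcase}/#{id}"
    end

    # Read previously cached API attributes, or nil on a miss.
    def read_local_cache(id)
      STORE[cache_key_for(id)]
    end
  end

  # Cache the attributes the API sent us under a per-record key.
  def write_local_cache
    STORE[self.class.cache_key_for(attributes["id"])] = attributes
  end
end

# A minimal API-backed model using the concern.
class Customer
  include ApiCacheable
  attr_reader :attributes

  def initialize(attributes)
    @attributes = attributes
  end
end

customer = Customer.new("id" => 1, "total_due" => 150)
customer.write_local_cache
Customer.read_local_cache(1) # => {"id"=>1, "total_due"=>150}
```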
Since we had to define the class-level find for our API-backed models anyway, we decided to couple that find method with the write_local_cache method.
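A hedged sketch of that coupling. `fetch_from_api` is a stand-in for the real HTTP call, and a class-level hash stands in for `Rails.cache` so the example runs on its own; only `write_local_cache` is a name the post actually uses:

```ruby
# Hypothetical sketch: every find of an API-backed record also warms the cache.
class Customer
  CACHE = {} # stand-in for Rails.cache

  attr_reader :attributes

  def initialize(attributes)
    @attributes = attributes
  end

  # Stand-in for the HTTP call to the API.
  def self.fetch_from_api(id)
    { "id" => id, "total_due" => 100 }
  end

  # The class-level find, coupled with the cache write.
  def self.find(id)
    record = new(fetch_from_api(id))
    record.write_local_cache
    record
  end

  def write_local_cache
    CACHE["customer/#{attributes["id"]}"] = attributes
  end
end
```

With this in place, any code path that fetches a record from the API leaves its attributes behind in the cache as a side effect.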
So to grab a model from the cache, I simply had to do
in the controllers feeding this dashboard. Alternatively, you can do this in relationship methods on a model.
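The read side might look something like this (again a sketch: `read_local_cache`, the key scheme, and the pre-warmed entry are all assumptions, with a hash standing in for a `Rails.cache` read):

```ruby
# Hypothetical read side, as a controller action might use it.
class Customer
  # Pretend the cache was already warmed by an earlier find.
  CACHE = { "customer/1" => { "id" => 1, "total_due" => 250 } }

  def self.read_local_cache(id)
    CACHE["customer/#{id}"]
  end
end

# In a dashboard controller action: read cached attributes, no API request.
cached = Customer.read_local_cache(1)
cached["total_due"] # => 250
```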
So this ended up taking us from a 65-second page load that resulted in 400 API requests to about a 1.5-second page load with 0 API requests. And the business was fine with this data being anywhere from 1 to 12 hours out of date. Since this cache is so long-lived, I wrote a rake task to preheat it, which I run every day at 6am before the app starts getting used heavily:
```ruby
namespace :app_name do
  desc "Pre-heats the cache based off current work assignments"
  task preheat_cache: :environment do
    Assignment.not_worked.find_each do |assignment|
      customer = Customer.find assignment.customer_id
      customer.total_due # this preheats a relationship to another API-backed model
    end
  end
end
```
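For the daily 6am run, a crontab entry along these lines would do it (the deploy path and `RAILS_ENV` are assumptions; use whatever scheduler your app already has):

```shell
# Hypothetical crontab entry: run the preheat task at 6:00 every day.
0 6 * * * cd /var/www/app_name && bin/rake app_name:preheat_cache RAILS_ENV=production
```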