Development

Use Redis Sets to track and expire cache keys in Rails

Arnaud Lachaume
June 8, 2021
4 min read

This article shows you how to use Redis Sets to track and expire cache keys when your cache entries involve multiple resources. A lightweight and powerful approach for properly managing cache keys.


TL;DR: You use Redis? Don't limit yourself to Rails.cache. Redis offers plenty of functionality to manage your cache efficiently, including Sets and Lists to manage collections. If you get your hands dirty with Rails.cache.redis it will eventually pay off.

Caching is all about exhaustively expiring cache entries to avoid stale data.

A very common approach in fragment caching is to rely on record timestamps to ensure that your cache fragments do not serve stale versions of the underlying data. It's low maintenance and works well, though it still makes database calls to check record timestamps.

Another approach is to use event-driven expiration. You create cache entries and manually expire cache keys when involved resources get updated.

This approach requires more maintenance - as you must ensure that update events properly lead to cache expiration - but opens the door to more efficient and flexible caching.

Last week we talked about basic caching by implementing a Project.find_cached method that caches the result of the find method. For this we developed a module that automatically clears the find cache entry on save.

We quickly ran into more complexity as soon as Project.find_cached started to eager load related parents, essentially because we had to also expire the cache when parents were updated.

The solution we ended up with looks like this:
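In rough strokes, it was something along these lines (the project/find/<id> key format and the callback wiring are illustrative assumptions, not the exact code from that article):

class Project < ApplicationRecord
  belongs_to :company

  # Clear our own find cache entry whenever the project changes
  after_commit { Rails.cache.delete("project/find/#{id}") }

  def self.find_cached(id)
    Rails.cache.fetch("project/find/#{id}") { includes(:company).find(id) }
  end
end

class Company < ApplicationRecord
  has_many :projects

  # The company has to know about Project cache keys and expire them one by one
  after_commit :expire_project_cache_entries

  private

  def expire_project_cache_entries
    projects.each { |project| Rails.cache.delete("project/find/#{project.id}") }
  end
end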

It's not graceful but it works.

It's not graceful because there is a lot of code involved just to expire the Project cache entries on the company side. We can definitely do better.

If you use Redis in your Rails app then it's time to get your hands dirty with Rails.cache.redis.

Ensure Redis is properly set up

The redis-rb gem is not thread-friendly by default. If you use redis-rb without a connection pool you will end up with race conditions on Redis accesses.

Here is a proper setup for Redis in Rails (you can also read the Rails guide on Redis pooling).

First make sure your Gemfile includes the following:
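At a minimum that means the redis gem, plus connection_pool so the cache store can give each thread its own connection (a minimal sketch, your versions and extra drivers may vary):

# Gemfile
gem 'redis'
gem 'connection_pool' # enables pooled connections for the Redis cache store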

Create a config file for redis:
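One possible shape for it is an initializer that wraps the client in a connection pool sized to your thread count (the file name, constant and defaults below are assumptions to adapt):

# config/initializers/redis.rb
require 'redis'
require 'connection_pool'

# Shared pool for direct Redis access, one connection per server thread
REDIS_POOL = ConnectionPool.new(size: ENV.fetch('RAILS_MAX_THREADS', 5).to_i, timeout: 5) do
  Redis.new(url: ENV.fetch('REDIS_URL', 'redis://localhost:6379/0'))
end

Any direct call then goes through REDIS_POOL.with { |redis| ... } so threads never share a single connection.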

Finally, edit your application.rb and specify your cache store:
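For example, something along these lines (the URL, pool size and timeout are placeholder values, and the option names follow the Rails versions current at the time of writing):

# config/application.rb, inside your Application class
config.cache_store = :redis_cache_store, {
  url: ENV.fetch('REDIS_URL', 'redis://localhost:6379/0'),
  pool_size: ENV.fetch('RAILS_MAX_THREADS', 5).to_i,
  pool_timeout: 5
}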

Good! Now you're ready to use Redis.

Redis is more than Rails.cache

Rails.cache gives you access to read/read_multi, write/write_multi, increment and decrement methods. These cover the basics but miss some of the most powerful Redis features: Sets and Lists.

Sets and Lists allow you to track collections directly in Redis. From a caching perspective, this lets you track relationships between cache keys by storing and retrieving those relationships as collections.

Looping back to Rails.cache, it's much more efficient to handle collections via Redis Sets and Lists rather than relying on some form of key pattern matching (e.g. Rails.cache.delete_matched, in case you envisaged that solution).

Here are some examples of using Redis Sets and Lists in Ruby/Rails:
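For instance, a handful of the core commands look like this (the keys below are made up for illustration; with a pooled cache store, Rails.cache.redis may return a ConnectionPool, in which case wrap the calls in .with { |redis| ... }):

require 'redis'

redis = Redis.new(url: ENV.fetch('REDIS_URL', 'redis://localhost:6379/0'))

# Sets: unordered collections of unique members
redis.sadd('project/1/dependent_keys', 'company/1/feed')
redis.sismember('project/1/dependent_keys', 'company/1/feed') # => true
redis.smembers('project/1/dependent_keys')                    # => ["company/1/feed"]
redis.srem('project/1/dependent_keys', 'company/1/feed')

# Lists: ordered collections, duplicates allowed
redis.rpush('recently_viewed_projects', '42')
redis.lrange('recently_viewed_projects', 0, -1)               # => ["42"]
redis.lpop('recently_viewed_projects')                        # => "42"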

There are many other commands available for Sets and Lists; the Redis command documentation lists them all.

Now let's see how we can harness that new Redis power to improve our caching strategies.

Using Sets to register cache dependencies and expire them automatically

Manually expiring cache keys in associated resources is always a bit ugly. You have to implement custom expiration logic in a commit callback to expire cache keys that were built by another record class.

Let's try a reusable approach where foreign dependencies are declared by the cached resource and managed as a Set.

The following module provides reusable logic for Active Record models to register associated cache keys and expire them when the record is updated. You could include this module in ApplicationRecord directly.
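Here is a sketch of what such a module can look like. The module name is the one used below, but the Redis key format and method names are assumptions to adapt to your own conventions:

# Records keep a Redis Set of the cache keys that depend on them
# and clear those keys after any commit.
module HasCacheDependencies
  extend ActiveSupport::Concern

  included do
    after_commit :expire_dependent_cache_keys
  end

  # Register a cache key to expire whenever this record changes
  def register_dependent_cache_key(cache_key)
    with_redis { |redis| redis.sadd(dependent_cache_set_key, cache_key) }
  end

  private

  # The Redis Set holding all the cache keys depending on this record
  def dependent_cache_set_key
    "cache_dependencies/#{self.class.name.underscore}/#{id}"
  end

  def expire_dependent_cache_keys
    keys = with_redis { |redis| redis.smembers(dependent_cache_set_key) }
    keys.each { |key| Rails.cache.delete(key) }
    with_redis { |redis| redis.del(dependent_cache_set_key) }
  end

  # Rails.cache.redis may be a ConnectionPool (pooled setup) or a raw client
  def with_redis(&block)
    conn = Rails.cache.redis
    conn.respond_to?(:with) ? conn.with(&block) : block.call(conn)
  end
end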

With the HasCacheDependencies module any resource can declare a cache entry as being dependent on a record by invoking:
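Something like this, reusing the key format assumed above:

# Expire this project's cached find entry whenever its company changes
project.company.register_dependent_cache_key("project/find/#{project.id}")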

Including this module on parent associations allows us to simplify our cache expiration strategy for the Company <-> Project relationship.

It's so simple that we even added a parent User model on Project to show what it looks like with multiple resource dependencies.

Our new version of cache registration/expiration looks like this:
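Here is a sketch of what this can look like, building on the assumed project/find/<id> key format and the HasCacheDependencies sketch above:

class Company < ApplicationRecord
  include HasCacheDependencies
  has_many :projects
end

class User < ApplicationRecord
  include HasCacheDependencies
  has_many :projects
end

class Project < ApplicationRecord
  belongs_to :company
  belongs_to :user

  # The project still clears its own entry on save
  after_commit { Rails.cache.delete("project/find/#{id}") }

  def self.find_cached(id)
    Rails.cache.fetch("project/find/#{id}") do
      includes(:company, :user).find(id).tap do |project|
        # Register the entry so that parent updates expire it too
        project.company.register_dependent_cache_key("project/find/#{id}")
        project.user.register_dependent_cache_key("project/find/#{id}")
      end
    end
  end
end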

The implementation above follows the natural logic of saying "upon creating this cache entry, please remember to expire it when associated records get updated".

This approach is way more efficient and streamlined than the one presented at the beginning of the article because:

  1. Parent models do not need to make a single database call on after_commit to expire cache keys
  2. Parent models do not need to load all their projects on after_commit to expire cache keys - only the keys which were registered get cleared. It's an opportunistic approach.
  3. There is no custom logic in each parent model. The cache key registration/expiration pattern is reusable across models.


Wrapping up

Using native Redis features can open the door to many optimizations in your application, especially related to caching.

The example above is one of many. As we stated last week, optimizing the find method on its own will not really make a difference in your app. But the pattern of registering and expiring cache dependencies can help you put many more complex caching strategies in place.

Beyond caching, it's possible to use Redis as a full datastore and almost entirely bypass your database. We might show how in a future blog article.

Happy caching!

About us

Keypup's SaaS solution allows engineering teams and all software development stakeholders to gain a better understanding of their engineering efforts by combining real-time insights from their development and project management platforms. The solution integrates multiple data sources into a unified database along with a user-friendly dashboard and insights builder interface. Keypup users can customize tried-and-true templates or create their own reports, insights, and dashboards to get a full picture of their development operations at a glance, tailored to their specific needs.

