Caching records by tags in Go

development golang cache

Cache Invalidation

When frequently accessed data stored in a cache is updated, the application must evict the stale copy from the cache. Subsequent requests for the same data then miss the cache, so the application retrieves fresh data from the database and stores it in the cache again. This way, users never receive stale data from the application. Cache invalidation becomes easier if data is tagged when it is stored in the cache: when the data needs to be invalidated, we can use the tag names to evict every associated entry.
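The read path described above can be sketched without any external store. The following is a minimal cache-aside illustration, with an in-memory map standing in for Redis; the names `cacheStore`, `loadFromDB`, and `getPost` are my own placeholders, not part of the implementation below.

```go
package main

import "fmt"

// cacheStore is a placeholder in-memory cache standing in for Redis,
// and dbCalls counts trips to the (stand-in) database.
var (
	cacheStore = make(map[string]string)
	dbCalls    int
)

// loadFromDB is a hypothetical stand-in for a real database query.
func loadFromDB(key string) string {
	dbCalls++
	return "data for " + key
}

// getPost implements the read path described above: serve from the cache
// when possible; on a miss, fetch fresh data and store it for next time.
func getPost(key string) string {
	if v, ok := cacheStore[key]; ok {
		return v // cache hit: no database round trip
	}
	v := loadFromDB(key) // cache miss: query the database
	cacheStore[key] = v  // populate the cache for subsequent requests
	return v
}

func main() {
	getPost("post1")            // miss: hits the database
	getPost("post1")            // hit: served from the cache
	delete(cacheStore, "post1") // invalidation evicts the entry
	getPost("post1")            // miss again: fresh data is re-fetched
	fmt.Println("database calls:", dbCalls) // prints "database calls: 2"
}
```

Invalidation is what keeps this loop honest: without the eviction step, the third read would happily return whatever was cached first.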

Here is an implementation of caching records by tags. We will use Redis and the go-redis client package.

```go
import (
	"context"
	"encoding/json"
	"time"

	"github.com/go-redis/redis/v8"
)

type cache struct {
	client *redis.Client
}

func (c *cache) SetByTags(ctx context.Context, key string, val interface{}, tags []string, expiry time.Duration) error {
	// Serialize the value so it can be stored as a Redis string.
	b, err := json.Marshal(val)
	if err != nil {
		return err
	}

	pipe := c.client.TxPipeline()
	for _, tag := range tags {
		// Record the key in this tag's set, and keep the set alive
		// at least as long as the cached value itself.
		pipe.SAdd(ctx, tag, key)
		pipe.Expire(ctx, tag, expiry)
	}
	pipe.Set(ctx, key, b, expiry)

	_, err = pipe.Exec(ctx)
	return err
}

func (c *cache) Invalidate(ctx context.Context, tags []string) {
	keys := make([]string, 0)
	for _, tag := range tags {
		// Collect the keys tracked under this tag, plus the tag set itself.
		k, _ := c.client.SMembers(ctx, tag).Result()
		keys = append(keys, tag)
		keys = append(keys, k...)
	}
	c.client.Del(ctx, keys...)
}
```
In the SetByTags method, we maintain a Redis set for each tag provided and add the given cache key to each of those sets. When invalidating by tags, we fetch all the members of each tag's set, i.e. the cache keys, and delete both the keys and the tag sets themselves from Redis.
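The tag bookkeeping itself is easy to see without a running Redis. Below is a hedged in-memory sketch of the same idea (the `tagStore` type and its field names are mine, not from the code above): each tag maps to a set of keys, so invalidating a tag evicts every key recorded in its set.

```go
package main

import "fmt"

// tagStore is a hypothetical in-memory stand-in for the Redis-backed cache:
// values holds key -> cached value, tags holds tag -> set of keys.
type tagStore struct {
	values map[string]string
	tags   map[string]map[string]struct{}
}

func newTagStore() *tagStore {
	return &tagStore{
		values: make(map[string]string),
		tags:   make(map[string]map[string]struct{}),
	}
}

// SetByTags stores the value and records the key under every tag,
// mirroring the SADD calls in the Redis version.
func (s *tagStore) SetByTags(key, val string, tags []string) {
	s.values[key] = val
	for _, tag := range tags {
		if s.tags[tag] == nil {
			s.tags[tag] = make(map[string]struct{})
		}
		s.tags[tag][key] = struct{}{}
	}
}

// Invalidate deletes every key recorded under the given tags, then the tag
// sets themselves, mirroring the SMEMBERS + DEL calls in the Redis version.
func (s *tagStore) Invalidate(tags []string) {
	for _, tag := range tags {
		for key := range s.tags[tag] {
			delete(s.values, key)
		}
		delete(s.tags, tag)
	}
}

func main() {
	s := newTagStore()
	s.SetByTags("blog:one:post1", "blog data", []string{"post1", "post2"})
	s.SetByTags("blog:one:post2", "blog data", []string{"post1", "post2"})

	s.Invalidate([]string{"post1"})
	_, ok := s.values["blog:one:post2"]
	fmt.Println("blog:one:post2 still cached:", ok) // false: it shared tag "post1"
}
```

The Redis version adds what this sketch leaves out: expiry on both the values and the tag sets, and a transactional pipeline so the writes land together.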

Example usage:

```go
tags := []string{"post1", "post2"}
value := "blog data"

key1 := "blog:one:post1"
key2 := "blog:one:post2"

// c is a *cache wired to a Redis client.
// Set cache by tags here:
c.SetByTags(ctx, key1, value, tags, 24*time.Hour)
c.SetByTags(ctx, key2, value, tags, 24*time.Hour)

// Invalidate cache by tag "post1" here.
// Both key1 and key2 will be evicted, since both were tagged with "post1".
c.Invalidate(ctx, []string{"post1"})
```


If you would like to be notified about more articles like this, please subscribe to my Substack.
