Building custom Caching services in Drupal


The need for custom solutions

Drupal’s default caching mechanisms, such as cache.render and cache.page, are designed for general use cases. They work well for standard content delivery, reducing load times and database queries across a broad range of applications. However, complex applications often require specialised caching logic that transcends these built-in systems.

Take, for instance, a financial reporting platform that processes thousands of data points per request. Default caching might reduce some backend calls, but it doesn’t address the computational overhead of transforming raw financial data into user-specific reports with real-time accuracy. Similarly, consider a personalised learning platform that dynamically assembles course content based on a learner’s history, progress, and preferences. In both cases, caching needs to go beyond what Drupal offers out of the box.

Default strategies fall short when performance demands collide with precision and dynamic output. In such scenarios, a custom caching service becomes essential, engineered to recognise context-specific data dependencies, invalidate intelligently, and serve pre-processed results without compromising on accuracy or personalisation.

Limitations of default Caching

Default caching systems are optimised for simplicity, but they struggle with niche requirements:

  • Static vs. dynamic content: While static content (like product descriptions) benefits from broad caching, dynamic data (such as stock prices) requires fine-grained invalidation rules.
  • Complex data structures: Nested JSON responses or hierarchical datasets may need specialised serialisation formats or invalidation triggers.
  • Contextual variations: Default contexts like user.roles or url.path cannot accommodate business-specific variations (for example, caching based on user subscription tiers).
  • Third-party integrations: External APIs or microservices often demand unique caching strategies to reduce latency and avoid rate limits.

When custom Caching adds value

Custom services shine in scenarios where default caching falls short:

  • High-traffic, low-change content: E-commerce product listings that update infrequently but receive frequent visits.
  • External API reliance: Applications that pull data from third-party services and need to mitigate network latency.
  • Granular invalidation: Systems requiring precise cache clearing (such as refreshing only reports tied to a specific user).

Avoid custom caching for rapidly changing data (like live chat messages) or sensitive information (such as user-specific financial records), where default mechanisms or no caching at all may be safer.

Designing a custom Caching service

Core components explained

A custom caching service in Drupal relies on four foundational elements:

1. Cache backend
This defines where data is stored. Common backends include:

  • Database: Default storage (cache.default), suitable for small-scale applications.
  • Memory stores: APCu or Redis for faster access in single-server setups.
  • Distributed Caches: Redis clusters or Memcached for multi-server environments.
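
As a rough sketch of how the backend choice is made, either the default backend or an individual bin can be overridden in settings.php. The cache.backend.redis service below assumes the contributed Redis module is installed; cache.backend.database ships with core:

// settings.php
// Use Redis for every cache bin by default (assumes the Redis module).
$settings['cache']['default'] = 'cache.backend.redis';
// Keep a specific bin in the database instead.
$settings['cache']['bins']['render'] = 'cache.backend.database';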

2. Cache tags
Tags determine what triggers cache invalidation. For example:

  • node:123 clears caches tied to a specific node.
  • financial_report:2023 targets caches related to annual financial data.

Avoid broad tags like content_type:article, which can inadvertently clear unrelated caches.

Read more about cache tags.
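
As a short sketch (the cache ID and $report_data are illustrative), tags are attached when an item is written and can be invalidated from anywhere once the underlying data changes:

use Drupal\Core\Cache\Cache;

// Store a computed report and record what it depends on.
\Drupal::cache()->set('mymodule:report:2023', $report_data, Cache::PERMANENT, ['financial_report:2023', 'node:123']);

// Later, when the 2023 figures change, clear everything carrying that tag.
Cache::invalidateTags(['financial_report:2023']);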

3. Cache contexts
Contexts define when to vary the cache. For instance:

  • user.roles creates separate caches for admins vs. guests.
  • languages (for example, languages:language_interface) ensures multilingual sites serve content in the correct language.

Use narrow contexts (like url.path instead of url) to prevent cache bloat. Note: for custom cache bins, you must include context values in your cache ID yourself, as cache contexts are only resolved automatically for render caching.

More on cache contexts.
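
One way to honour the note above, sketched with core’s cache_contexts_manager service (the base ID is illustrative), is to convert context tokens into concrete key parts and fold them into the cache ID:

// Resolve context tokens (e.g. the current user's roles and language) into key parts.
$keys = \Drupal::service('cache_contexts_manager')
  ->convertTokensToKeys(['user.roles', 'languages:language_interface'])
  ->getKeys();

// Append the resolved values to the ID used with a custom cache bin.
$cache_id = 'mymodule:dashboard:' . implode(':', $keys);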

4. Cache metadata
Metadata includes rules like max-age (how long to store data) and dependencies (e.g., tags, contexts). This metadata ensures cached data remains fresh and relevant.
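
In render arrays this metadata is typically carried by CacheableMetadata; a minimal sketch, assuming $build is a render array for a report block:

use Drupal\Core\Cache\CacheableMetadata;

$metadata = new CacheableMetadata();
$metadata->setCacheMaxAge(3600)
  ->addCacheTags(['financial_report:2023'])
  ->addCacheContexts(['user.roles']);
// Attach the max-age, tags and contexts to the render array.
$metadata->applyTo($build);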

Dependency injection: why it matters

Drupal’s service container allows caching logic to be decoupled from business logic. By injecting dependencies like CacheBackendInterface and CacheContextsManager, services become:

  • Testable: Easily mocked in unit tests.
  • Flexible: Replace backends (for example, switching from database to Redis) without rewriting logic.

use Drupal\Core\Cache\CacheBackendInterface;
use Drupal\Core\Cache\Context\CacheContextsManager;

class CustomCacheService {
  protected $cacheBackend;
  protected $cacheContextsManager;

  public function __construct(
    CacheBackendInterface $cache_backend,
    CacheContextsManager $cache_contexts_manager
  ) {
    $this->cacheBackend = $cache_backend;
    $this->cacheContextsManager = $cache_contexts_manager;
  }
}

Learn about dependency injection in Drupal.
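
To illustrate the testability point, a minimal PHPUnit sketch (the test itself is illustrative) can hand the service mocked dependencies instead of a real backend:

use Drupal\Core\Cache\CacheBackendInterface;
use Drupal\Core\Cache\Context\CacheContextsManager;
use Drupal\mymodule\Cache\CustomCacheService;
use PHPUnit\Framework\TestCase;

class CustomCacheServiceTest extends TestCase {

  public function testServiceAcceptsMockedDependencies() {
    $backend = $this->createMock(CacheBackendInterface::class);
    $contexts_manager = $this->createMock(CacheContextsManager::class);

    $service = new CustomCacheService($backend, $contexts_manager);
    $this->assertInstanceOf(CustomCacheService::class, $service);
  }

}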

Step-by-step implementation

1. Register the service

Define the service in mymodule.services.yml:

services:
  mymodule.custom_cache:
    class: Drupal\mymodule\Cache\CustomCacheService
    arguments:
      - '@cache.default'
      - '@cache_contexts_manager'

This binds the service to Drupal’s dependency injection system, enabling reusability and testability.
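
Once registered, the service can be injected wherever it is needed. For example, a hypothetical controller could receive it through its create() factory instead of reaching into the container directly:

use Drupal\Core\Controller\ControllerBase;
use Symfony\Component\DependencyInjection\ContainerInterface;

class ReportController extends ControllerBase {

  protected $customCache;

  public function __construct($custom_cache) {
    $this->customCache = $custom_cache;
  }

  public static function create(ContainerInterface $container) {
    // Pull the service defined in mymodule.services.yml out of the container.
    return new static($container->get('mymodule.custom_cache'));
  }

}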

2. Implement Caching logic

Create a method to fetch and cache data. Note: For custom cache bins, you must handle cache contexts by including them in the cache ID if needed.

public function getCachedData($cache_id, $callback, $options = []) {
  // Merge in sensible defaults for expiry and tags.
  $options += [
    'expire' => CacheBackendInterface::CACHE_PERMANENT,
    'tags' => ['my_custom_cache'],
  ];

  // Serve the cached copy if one exists.
  if ($cached = $this->cacheBackend->get($cache_id)) {
    return $cached->data;
  }

  // Cache miss: run the expensive callback.
  $data = $callback();

  // Only cache real results; a NULL (e.g. a failed fetch) should be retried.
  if ($data !== NULL) {
    $this->cacheBackend->set(
      $cache_id,
      $data,
      $options['expire'],
      $options['tags']
    );
  }

  return $data;
}

How It works:

  • Check: First, the service checks if the cache exists.
  • Generate: If not, the expensive operation (such as an API call) runs.
  • Store: Results are saved with metadata for future reuse.

See the official Cache API usage guide.
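
A hypothetical call, for example from the ReportController sketched earlier, might look like the following, with the closure standing in for the expensive work ($account and buildReportFor() are illustrative):

$report = $this->customCache->getCachedData(
  'mymodule:report:' . $account->id(),
  function () use ($account) {
    // Runs only on a cache miss.
    return $this->buildReportFor($account);
  },
  ['tags' => ['financial_report:2023']]
);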

Multi-level Caching: solving complex bottlenecks

Layered Caching Strategy

Multi-level caching balances speed and resilience by combining storage tiers:

  • Memory Cache: Fastest but volatile, for example a per-request static property (as in the sketch below) or APCu.
  • Local persistent Cache: Survives server restarts (e.g., Redis on a single server).
  • Distributed Cache: Scales across servers (e.g., Redis clusters).

use Drupal\Core\Cache\CacheBackendInterface;

class MultiLevelCacheService {
  // Per-request static cache: fastest tier, lost at the end of the request.
  protected $memoryCache = [];
  protected $localCache;
  protected $distributedCache;

  public function __construct($localCache, $distributedCache) {
    $this->localCache = $localCache;
    $this->distributedCache = $distributedCache;
  }

  public function getData($key, $fetchCallback) {
    // Tier 1: in-memory array.
    if (isset($this->memoryCache[$key])) {
      return $this->memoryCache[$key];
    }

    // Tier 2: local persistent cache; promote hits into memory.
    if ($local_cached = $this->localCache->get($key)) {
      $this->memoryCache[$key] = $local_cached->data;
      return $local_cached->data;
    }

    // Tier 3: distributed cache; promote hits into the faster tiers.
    if ($distributed_cached = $this->distributedCache->get($key)) {
      $this->localCache->set($key, $distributed_cached->data, CacheBackendInterface::CACHE_PERMANENT);
      $this->memoryCache[$key] = $distributed_cached->data;
      return $distributed_cached->data;
    }

    // Full miss: generate the data and populate every tier on the way back.
    $data = $fetchCallback();

    $this->memoryCache[$key] = $data;
    $this->localCache->set($key, $data, CacheBackendInterface::CACHE_PERMANENT);
    $this->distributedCache->set($key, $data, CacheBackendInterface::CACHE_PERMANENT);

    return $data;
  }
}

Use case:
A news site with high traffic on its home page. The memory cache handles immediate requests, while the distributed layer ensures consistency across a server cluster.
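
Wiring this up might look like the following services.yml sketch, where '@cache.default' stands in for the local tier and cache.mymodule_distributed is a hypothetical Redis-backed bin (defining custom bins is covered in the next article):

services:
  mymodule.multi_level_cache:
    class: Drupal\mymodule\Cache\MultiLevelCacheService
    arguments:
      - '@cache.default'
      - '@cache.mymodule_distributed'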

Real-world example: Caching external API data

Scenario

An e-commerce platform pulls product pricing from a third-party API. Without caching, each request takes 2 seconds due to network latency.

Solution

Add a caching layer to reduce latency:

class ApiCacheService extends CustomCacheService {
  protected $apiClient;

  public function __construct($cache_backend, $cache_contexts_manager, $apiClient) {
    parent::__construct($cache_backend, $cache_contexts_manager);
    $this->apiClient = $apiClient;
  }

  public function fetchWithCaching($endpoint, $params = []) {
    $cache_id = 'api_cache:' . md5($endpoint . serialize($params));

    return $this->getCachedData($cache_id, function() use ($endpoint, $params) {
      try {
        $response = $this->apiClient->request($endpoint, $params);
        return $this->processApiResponse($response);
      } catch (\Exception $e) {
        \Drupal::logger('api_cache')->error('API fetch failed: @message', ['@message' => $e->getMessage()]);
        // NULL is not written to the cache (see getCachedData()), so the next request retries.
        return NULL;
      }
    }, [
      // Cache responses for 30 minutes (REQUEST_TIME is deprecated; use the time service).
      'expire' => \Drupal::time()->getRequestTime() + 1800,
      'tags' => ['external_api_data'],
    ]);
  }

  protected function processApiResponse($response) {
    // Transform or sanitise as needed.
    return $response;
  }
}

Impact:

  • Latency reduction: Cuts API call time from 2s to near-instant.
  • Graceful degradation: Falls back to cached data during API outages.
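
When fresh pricing must be forced before the 30-minute window expires, the external_api_data tag used above provides a one-line escape hatch, for example from a webhook or cron handler (sketch):

use Drupal\Core\Cache\Cache;

// Drop every cached API response as soon as upstream prices change.
Cache::invalidateTags(['external_api_data']);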

Error handling and fallbacks

Resilience through fallback layers

When a cache backend fails (e.g., Redis crashes), a fallback mechanism prevents downtime.

public function getFallbackData($primary_key, $fallback_callback) {
  try {
    return $this->getCachedData($primary_key, $fallback_callback);
  } catch (\Exception $e) {
    \Drupal::logger('cache_service')->warning('Primary cache failed, using fallback');
    return $fallback_callback();
  }
}

Why it works:

  • User experience: Keeps the site functional during backend failures.
  • Operational continuity: Ensures critical features remain accessible.

Performance monitoring

Tracking what matters

Create a tracker to measure cache effectiveness:

class CachePerformanceTracker {
  protected $stats = ['hits' => 0, 'misses' => 0, 'generation_time' => []];

  public function recordHit() {
    $this->stats['hits']++;
  }

  public function recordMiss() {
    $this->stats['misses']++;
  }

  public function trackCachePerformance($cache_id, $callback) {
    $start = microtime(TRUE);
    $result = $callback();
    $this->stats['generation_time'][$cache_id] = microtime(TRUE) - $start;
    return $result;
  }
}

Metrics to monitor:

  • Hit rate: High hit rates indicate effective caching.
  • Generation time: Identify slow-to-generate content for optimisation.
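
A hedged sketch of how the tracker might sit alongside a cache lookup (build_expensive_report() is a hypothetical generator):

$tracker = new CachePerformanceTracker();

if ($cached = \Drupal::cache()->get('mymodule:report:2023')) {
  $tracker->recordHit();
  $report = $cached->data;
}
else {
  $tracker->recordMiss();
  // Time the regeneration so slow items show up in the stats.
  $report = $tracker->trackCachePerformance('mymodule:report:2023', function () {
    return build_expensive_report();
  });
  \Drupal::cache()->set('mymodule:report:2023', $report);
}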

Best practices and common pitfalls

Best practices

  • Granular tags: Use financial_report:2023 instead of broad tags.
  • Minimise contexts: Prefer url.path over url to reduce variants.
  • Avoid sensitive data: Never cache CSRF tokens or PII.
  • Monitor: Track hit rates and generation times for proactive optimisation.

Common pitfalls

  • Overly broad contexts: Using url instead of url.path creates redundant cache entries.
  • No fallbacks: A cache failure should not break the site.

Series navigation

This article is part of our comprehensive 10-part series on Drupal caching:

  1. Introduction to Drupal Caching
  2. Understanding Drupal’s Cache API
  3. Choosing the Right Cache Backend for Your Drupal Site
  4. Mastering Page and Block Caching in Drupal
  5. Optimising Drupal Views and Forms with Caching
  6. Entity and Render Caching for Drupal Performance
  7. Building Custom Caching Services in Drupal (You are here)
  8. Implementing Custom Cache Bins for Specialised Needs (Coming soon)
  9. Advanced Drupal Cache Techniques (Coming soon)
  10. Drupal Caching Best Practices and Performance Monitoring (Coming soon)

What’s next?

While Drupal’s default caching covers the basics, scaling performance for advanced applications often means thinking beyond the core. Custom caching services allow developers to fine-tune how data is stored, served, and invalidated, especially when dealing with highly dynamic or resource-intensive workloads.

In the next blog, we’ll explore how to go one step further with custom cache bins. You’ll learn how to create isolated cache storage for niche scenarios like session-based data, configure these bins in settings.php, and fine-tune caching strategies for performance-critical use cases.

To dive deeper into Drupal’s caching capabilities in the meantime, check out the Cache API documentation, along with resources on Cache Tags, Cache Contexts, and Cacheable Metadata.

Written by Ananya Rakhecha, Tech Advocate