Fastify, with its focus on speed and low overhead, is a popular choice among web developers. Implementing an efficient caching strategy is crucial to enhance the performance of your applications further. In this blog post, I'll take you through implementing effective caching using Fastify.
Understanding caching
Caching involves storing copies of frequently accessed data in a temporary storage area to reduce redundant work and improve response times. There are two main cache types: client-side caching and server-side caching. Client-side caching involves telling the browser (or other client) whether and for how long it may cache an asset. Server-side caching involves caching the result of a request on the server. This is useful when handling a request involves database queries or complex calculations.
There are several caching strategies available in Fastify. In this blog I'll focus on how to implement in-memory caching in Fastify and how to implement caching using Redis.
Choosing the right strategy
The right strategy depends on your application and the number of users you expect to serve. For smaller applications, in-memory caching might be sufficient, as it is easy to set up. Its primary downside is scalability: the cache lives in the server's own memory, so if your application runs on a cluster, the cache is not shared between nodes.
For large or high-traffic applications, Redis is a better solution: while the initial setup is harder, it is easier and cheaper to scale, and it can run centralised so that every node in a cluster accesses the same cache.
Implementing in-memory caching in Fastify
Using in-memory caching is rather straightforward with the @fastify/caching plugin. As mentioned, this strategy works best for small, low-traffic applications. Note that the plugin's author advises against this strategy for anything larger, since the default in-memory store is limited to 100,000 items (and depending on the available RAM and the size of the data you are storing, your server might run out of memory before reaching that limit). You can implement this strategy using the following code.
import fastify from 'fastify'
import cache from '@fastify/caching'

const app = fastify()

// You can provide a custom config to change the expiry and privacy options
app.register(cache)

app.listen({ port: 3000 }, (err) => {
  if (err) throw err
})
Implementing a Redis cache in Fastify
You can implement a Redis cache using the same @fastify/caching plugin; however, you will also need to install the abstract-cache-redis package. In addition, you will need to set up a Redis server. For this I generally recommend a managed database solution such as DigitalOcean's database clusters; with this link you'll receive $200 of credit for 60 days on DigitalOcean so that you can test out a Redis solution there. Once you have a Redis server running, you can implement the Redis cache using the following code.
import fastify from 'fastify'
import cache from '@fastify/caching'
import fastifyRedis from 'fastify-redis'
import abstractCache from 'abstract-cache'
import IORedis from 'ioredis'

const app = fastify()

// Connect to your Redis server
const redis = new IORedis({
  host: 'YOUR_REDIS_SERVER'
})

// Use the Redis driver so cached items live in Redis instead of local memory
const client = abstractCache({
  driver: {
    name: 'abstract-cache-redis',
    options: {
      client: redis
    }
  }
})

app.register(fastifyRedis, { client: redis })
app.register(cache, { cache: client })

app.listen({ port: 3000 }, (err) => {
  if (err) throw err
})
Fine tune your caching strategy
To get the best possible performance, it's crucial to fine-tune your cache policies: configure the cache to balance data freshness against server load, and set appropriate expiration times for cached data.
For frequently changing data, shorter expiration times ensure the cache stays current. On the other hand, for relatively static information, longer expiration times can significantly reduce the load on your server by serving cached content to users. Fastify's flexibility allows you to tailor these expiration times according to the specific needs of different routes or data types.
Handling cache invalidation is another crucial aspect. Fastify provides hooks and events that can trigger cache invalidation when data is updated or modified. This ensures that users always receive the latest information without compromising the benefits of caching. Additionally, consider employing versioning or unique cache keys to distinguish between different versions of cached data, especially when dealing with evolving datasets.
For really high-traffic applications, you can also consider creating multiple caching configurations and switching between them depending on the server's load. For example, when you receive much more traffic than usual, you could automatically activate a configuration aimed at reducing server load at the cost of some data freshness.
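One simple way to sketch this load-dependent switching (the threshold and TTL values are made up for the example) is a helper that picks a TTL from an observed request rate, which you could feed from your metrics and pass to `cache.set`:

```javascript
// Illustrative helper: choose a cache TTL (in milliseconds) based on load.
// The 1000 req/s threshold and both TTLs are arbitrary example values.
function pickTtl(requestsPerSecond) {
  if (requestsPerSecond > 1000) {
    return 300 * 1000 // heavy load: cache longer, accept staler data
  }
  return 30 * 1000 // normal load: favour freshness
}

console.log(pickTtl(2500)) // 300000
console.log(pickTtl(100)) // 30000
```

In practice you would smooth the rate over a window (so the configuration doesn't flap) before feeding it into a function like this.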
By carefully adjusting these cache policies, you can balance performance and data freshness, ensuring that your Fastify application delivers an optimal user experience. Experimenting with different configurations and monitoring the results is key to finding the sweet spot for your specific use case.
Ensuring your Fastify cache is secure
It's very important to ensure that you don't cache server-side rendered pages behind authorised routes and then serve that cache to other users. This could result in users seeing each other's personal information. It's an easy mistake to make, and it happens at companies of every size: Steam, during Christmas 2015, returned sensitive user information to the wrong users because of exactly this. A good rule is never to cache the following cases unless you have strategies to ensure the cache is only served to the correct user and is invalidated when changes are made.
Don't cache authorised routes
Don't cache access control such as if someone has permission to delete something
Don't cache sensitive information
Conclusion: Striking the Perfect Balance - Optimizing Performance in Your Fastify Application
In conclusion, optimising the performance of your Fastify application through effective caching is a crucial step toward delivering a seamless user experience. Whether you choose in-memory caching for smaller applications or opt for the scalability of Redis for larger, high-traffic scenarios, selecting the right strategy is key.
Fine-tuning your caching strategy is equally important. Balancing data freshness and server load by setting appropriate expiration times, handling cache invalidation through Fastify's hooks and events, and considering multiple caching configurations for varying server loads are vital aspects. Careful adjustment of these policies ensures optimal performance and user experience.
In the next blog, I'll dive into how to manually implement caching for cases where you want more fine control of the caching implementation.