News & Announcements

Gain better insights into edge caching behavior on Netlify

Over the last year, Netlify has invested significantly in making the caching features of the Netlify Edge even more powerful. These platform primitives unlock the ability to create dynamic digital experiences. Here’s what’s new:

  • We've added support for fine-grained cache control headers to configure precisely which content gets cached, for how long, and in which CDN.
  • Cache Tag support allows developers to selectively invalidate content, helping maintain a high cache hit rate across the site.
  • With the addition of Netlify-Vary, dynamic content can be cached selectively for different variations in request details such as the URL query string, cookies, or headers.
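As a sketch, a single response might combine these features. The header values below are illustrative examples, not copied from Netlify's documentation; check the official caching docs for the exact directive names and syntax:

```http
Cache-Control: public, max-age=0, must-revalidate
Netlify-CDN-Cache-Control: public, s-maxage=31536000
Netlify-Vary: query=page|lang,cookie=ab_test
Cache-Tag: products,homepage
```

Here, the browser always revalidates, the edge caches the response for up to a year (until it is invalidated by tag), and separate cache entries are kept per `page`/`lang` query parameter and per `ab_test` cookie value.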

Cache-Status Response Header

Netlify now provides a new HTTP response header, Cache-Status, to offer insights into the caching process. Detailed in RFC 9211, this header includes directives that explain a cache’s behavior in generating an HTTP response.

Here are some example values that you might see in this header, and what they mean:

  • Cache-Status: "Netlify Edge"; hit - Response was served from the edge cache directly
  • Cache-Status: "Netlify Edge"; fwd=miss - Response was served from an origin because there was no cached object for this request
  • Cache-Status: "Netlify Edge"; fwd=stale - Response was served from an origin because the object in the cache was stale (invalidated or expired)
  • Cache-Status: "Netlify Edge"; hit, fwd=stale - A stale response was served from the edge cache, but a background refresh was triggered (because of the stale-while-revalidate cache directive)
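To make decisions based on these values in a script, you can check whether the "Netlify Edge" entry carries the `hit` directive. The helper below is a minimal sketch (not an official Netlify utility) that handles the example values above:

```python
# Sketch: check whether the Netlify Edge cache reported a hit in a
# Cache-Status header value (RFC 9211). Members are comma-separated,
# and each member's parameters are semicolon-separated.
def edge_cache_hit(cache_status: str) -> bool:
    """Return True if the "Netlify Edge" cache entry includes 'hit'."""
    for member in cache_status.split(","):
        parts = [p.strip() for p in member.split(";")]
        if parts and parts[0].strip('"') == "Netlify Edge":
            if "hit" in parts[1:]:
                return True
    return False

print(edge_cache_hit('"Netlify Edge"; hit'))       # True
print(edge_cache_hit('"Netlify Edge"; fwd=miss'))  # False
```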

Using Cache-Status information in synthetic tests

The information provided in 'Cache-Status' can be helpful when running synthetic tests against your Netlify site. A synthetic test can check whether the 'hit' directive is present in the 'Cache-Status' header, or export it as a metric. Grouping latency measurements by cache status yields much cleaner data than mixing measurements from cached and uncached responses.
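The grouping idea can be sketched as follows. The labels and latency numbers are invented for illustration; in a real synthetic test they would come from the observed Cache-Status header and the measured response time:

```python
from statistics import median

# Sketch: group synthetic-test latency samples by the cache status
# observed in the Cache-Status header, so cached and uncached
# responses are not blended into one misleading number.
def median_latency_by_cache_status(samples):
    """samples: iterable of (cache_status_label, latency_ms) pairs."""
    groups = {}
    for label, latency_ms in samples:
        groups.setdefault(label, []).append(latency_ms)
    return {label: median(vals) for label, vals in groups.items()}

samples = [
    ("hit", 18), ("hit", 22), ("hit", 20),
    ("fwd=miss", 240), ("fwd=miss", 260),
]
print(median_latency_by_cache_status(samples))
# {'hit': 20, 'fwd=miss': 250.0}
```

The mixed median over all five samples would sit between the two groups and describe neither the cached nor the uncached experience.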

Using cache status with Real User Monitoring (RUM)

When using a configurable Real User Monitoring (RUM) service, you can record the cache status as metadata alongside your measurements. This helps you differentiate latency and error rates for cached versus uncached responses to fetch calls in your frontend code.
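On the analysis side, this might look like the sketch below, which assumes each RUM beacon records a cache-status label as metadata next to the HTTP status code (the event shape is hypothetical, not a specific RUM vendor's format):

```python
# Sketch: compute the 5xx error rate per cache status from RUM events,
# assuming events are (cache_status_label, http_status) pairs.
def error_rate_by_cache_status(events):
    """Return the fraction of 5xx responses for each cache status label."""
    totals, errors = {}, {}
    for label, status in events:
        totals[label] = totals.get(label, 0) + 1
        if 500 <= status < 600:
            errors[label] = errors.get(label, 0) + 1
    return {label: errors.get(label, 0) / totals[label] for label in totals}

events = [("hit", 200), ("hit", 200), ("fwd=miss", 200), ("fwd=miss", 503)]
print(error_rate_by_cache_status(events))
# {'hit': 0.0, 'fwd=miss': 0.5}
```

Splitting the error rate this way can reveal, for example, an origin problem that is masked in aggregate because most traffic is served from the edge cache.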
