I want you to imagine a scenario.
You walk into a coffee shop and order a latte. The barista makes it, hands it to you, and you take a sip: it tastes perfect. You turn around to grab a napkin, turn back to the counter, and the latte is gone...
You ask for it again. The barista sighs, grinds the beans again, steams the milk again, and hands you a new cup.
This sounds ridiculous, right? Yet this is exactly how many frontend applications behave today.
We fetch a list of products. The user navigates to a product detail page, then clicks "Back", only to meet a spinner because the list is being fetched again.
Welcome to Part 1 of my series on Frontend Caching. Before we dive into caching with tools like TanStack Query, RTK Query, or Next.js (which we will cover in later articles), we need to fix our mindset.
We need to stop optimizing for "easy coding" and start optimizing for "instant experiences".
Why Caching is Not Optional
For a long time, caching was seen as a backend concern (Redis, database caching) or a purely "performance optimization" task you do at the end of a project.
But that's not the case. Caching affects user experience and cost, so it should be considered from the very start of a build.
Caching solves three specific problems:
Latency (The UX Killer): Even on 5G, a network request takes time. Reading from memory takes nanoseconds.
Bandwidth (The Cost Killer): Why download the same 500KB JSON payload five times in one session? Those extra requests cost real money.
Server Load (The Scale Killer): Your backend engineers will thank you if you stop hammering the API for data that hasn't changed.
The Core Concept: Fresh vs Stale Data
To understand caching, you must understand the concept of Staleness.
In a perfect world, our frontend would always be perfectly synced with the database. But the moment you fetch data to the client, it is technically stale. Something could have changed on the server the millisecond after your request finished.
Hence, caching is the art of accepting staleness for a defined period.
This is the mental shift. Instead of asking "Is this data perfectly up to date?", ask "Is this data fresh enough for the user right now?"
If I’m looking at a list of blog posts, does it matter if a new post was added 2 seconds ago? Probably not! I can show the cached list instantly; the data is stale but acceptable.
But if I’m looking at my bank account balance? That had better be fresh.
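The "fresh enough" question can be expressed directly in code. Here is a minimal sketch; the names and the `staleTime` values are illustrative assumptions, not recommendations:

```javascript
// A cached entry remembers *when* it was fetched.
function cacheEntry(data) {
  return { data, fetchedAt: Date.now() };
}

// "Fresh enough" depends on the kind of data, not a global rule.
function isFresh(entry, staleTime) {
  return Date.now() - entry.fetchedAt < staleTime;
}

const blogPosts = cacheEntry([{ id: 1, title: "Hello" }]);

// Blog posts: 5 minutes of staleness is acceptable.
console.log(isFresh(blogPosts, 5 * 60 * 1000)); // true (just fetched)

// Bank balance: a staleTime of 0 means "always refetch".
console.log(isFresh(blogPosts, 0)); // false
```

The whole design question of caching collapses into choosing the right `staleTime` per piece of data.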
The Three Layers of Caching
As frontend engineers, we usually deal with three distinct layers. Understanding where your data lives is half the battle.
1. The Browser Cache (HTTP)
This is the invisible layer. Before your React or Vue code even runs, the browser checks if it already has the resource. This is controlled by HTTP headers (Cache-Control, ETag, and the like).
The browser cache is usually automatic when the server is configured correctly, though it is hard to manipulate from JavaScript.
2. The CDN / Edge Cache
This lives between your user and your server. It stores copies of your content closer to the user geographically.
One advantage is fast delivery of static assets. But if you cache dynamic API responses here, you risk showing different users each other's data, which can be a nightmare to debug.
3. Application Memory (The Client State)
This is where we will focus most of this series. This layer is React Query, RTK Query, and the like, or even a simple object in your code.
Here you have instant access and complete control over invalidation logic, though it vanishes when the user refreshes the page (unless you persist it).
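At its simplest, this layer is just a map from query keys to cached entries, which is roughly what the libraries above manage for you. The sketch below is a hypothetical toy, not any library's real API:

```javascript
// A toy application-level cache: key -> { data, fetchedAt }.
const cache = new Map();

async function fetchWithCache(key, fetcher, staleTime = 30_000) {
  const entry = cache.get(key);
  if (entry && Date.now() - entry.fetchedAt < staleTime) {
    return entry.data; // instant: served straight from memory
  }
  // Miss (or stale): hit the network and remember the result.
  const data = await fetcher();
  cache.set(key, { data, fetchedAt: Date.now() });
  return data;
}

// Complete control over invalidation, because it's our own Map.
function invalidate(key) {
  cache.delete(key);
}
```

Because the `Map` lives in JavaScript memory, it disappears on refresh, exactly as described above.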
The Hardest Problem in Computer Science - Invalidation
You’ve likely heard the quote by Phil Karlton:
There are only two hard things in Computer Science: cache invalidation and naming things.
Invalidation is knowing when to delete the cache.
Imagine you cache a User Profile. The user updates their name on a Settings page.
You send the PUT request to update the name.
The server confirms it's updated.
The trap is this: you navigate back to the Profile page, and your cache still holds the old name. Now the user thinks the app is broken.
You have to tell the cache: "Hey, the data associated with User Profile is now dirty. Fetch it again next time."
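In code, that "tell the cache it's dirty" step is one extra line after the mutation succeeds. The sketch below is self-contained and hypothetical: the cache is a plain `Map`, and `fakePut` stands in for a real network call so the example runs:

```javascript
// Toy cache holding the stale profile. Real apps would use a library's cache.
const cache = new Map();
cache.set("user-profile", {
  data: { name: "Old Name" },
  fetchedAt: Date.now(),
});

// Stand-in for a real PUT request, so the sketch is runnable.
async function fakePut(url, body) {
  return { ok: true };
}

async function updateUserName(newName) {
  // 1. Send the PUT request to update the name.
  await fakePut("/api/user", { name: newName });
  // 2. The server confirmed — mark the cached profile as dirty.
  cache.delete("user-profile");
  // 3. The next read of "user-profile" misses and refetches fresh data.
}
```

Forgetting step 2 is precisely the trap described above: the mutation succeeds, but the user keeps seeing the old name from the cache.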
Technology has evolved to the point where we have tools that help us build better, more performant web applications...
What's Next?
In this series, we'll be moving from "No idea" to "Master". We aren't just going to learn how to use the tools; we are going to learn why they work the way they do.
Here is the roadmap:
Part 2: The Invisible Layer – Mastering HTTP Headers (because you can't fix with JS what you broke with Headers).
Part 3: The State Manager Revolution – deep dive into TanStack Query.
Part 4: The Redux Powerhouse – Implementation with RTK Query.
Part 5: The Full Stack – Server-side caching in Next.js.
See you in Part 2.
