Code caching (or bytecode caching) is a key optimization in which JavaScript engines like V8 store the compiled version of a script after it is first parsed and compiled, so subsequent loads can skip the compilation step and begin executing much sooner.
When a JavaScript engine loads a script, it normally must parse the source code and compile it into bytecode before execution can begin. This process consumes significant CPU time and delays page interactivity. Code caching eliminates this overhead by saving the compiled bytecode after the first execution and reusing it on future visits. Modern browsers implement multi-tiered caching strategies to balance performance gains against storage costs, with Chrome's implementation achieving approximately 86% hit rates for cacheable scripts and reducing parse/compile time by 20-40% on real-world pages.
In-memory Isolate Cache: A fast hashtable within the current V8 instance (same tab/process) that stores compiled bytecode keyed by the source text. It achieves ~80% hit rates in practice and operates with minimal overhead, but cannot persist across browser sessions or different processes.
On-disk Serialized Cache: Managed by Chrome's Blink engine, this cache stores compiled code on disk attached to the HTTP resource cache. It survives browser restarts and can be shared across different tabs and processes, but requires multiple visits to populate (cold → warm → hot sequence).
Modern code caching has evolved significantly. Prior to Chrome 59, caching had to happen before execution, which meant only eagerly-compiled functions (IIFEs) were cached. With V8's Ignition interpreter, the engine can now generate caches after top-level execution completes, capturing functions that were lazily compiled during runtime. This "improved code caching" increased cache coverage by 20-40% and reduced top-level page load metrics by 1-2% on mobile devices. The separation of generated code from context-dependent feedback vectors (inline caches) made this possible, as the bytecode itself remains context-agnostic while profiling data stays separate.
Browser-level caching: Chrome's code cache reduces JavaScript compilation time by 20-40% on most pages, with 86% of cacheable scripts hitting the cache.
Serverless platforms: Vercel's bytecode caching for Node.js functions improved cold start TTFB by 12-27% and reduced billed duration by 48-58% for applications between 250 kB and 800 kB.
Embedded systems: Research on mobile devices showed in-memory bytecode caching reduced compilation time by 57.9% and execution time by 30.9%.
Node.js native modules: The Node.js executable pre-compiles built-in modules into an embedded code cache, eliminating parse/compile overhead when these modules are required at runtime.
Do nothing: Write clean, idiomatic code. Caching is a browser implementation detail with evolving heuristics.
Keep code stable: Avoid frequent deployments; each code change invalidates the cache, forcing another cold run. Use 304 responses to preserve the cache when possible.
Maintain stable URLs: Changing a script's URL (including query parameters) creates a new cache entry that starts over from cold.
Split stable libraries: Separate rarely-changing library code into its own script so that business-logic changes don't invalidate the library's cache.
Avoid inline scripts: External scripts can be cached; inline scripts are tied to the HTML document and don't share across pages.
Group small files: Chrome ignores scripts under 1 KB (the overhead outweighs the benefit). Combine multiple tiny scripts to exceed the threshold.
For platform engineers: Use eagerCompile=true when generating caches for hot paths (it improves coverage), but disable it for cold startup. Consider generating caches on background threads to avoid blocking the main thread.
The fundamental trade-off in code caching is between coverage and freshness. Generating comprehensive caches requires executing more code (which takes time), but yields better subsequent performance. Modern engines strike this balance by caching after top-level execution, capturing commonly-used paths without forcing full eager compilation of every function. For serverless platforms with ephemeral storage, specialized implementations like Vercel's bytecode caching overcome the challenge of filesystem volatility by intelligently merging cache chunks as new code paths are discovered.