CI Pipeline Caching: Speed Up Every Build

By SumGuy 4 min read

The Pain

You push code. CI starts. First step: npm install. Four minutes. Every. Single. Run.

Then the lock file changes once a month. Cache invalidates. Back to four minutes.

This is debugging at 2 AM while waiting for CI to finish. It’s a terrible use of time.

What You’re Caching (And Why)

Dependency Cache

node_modules/ is expensive to install. It’s thousands of files, network requests, package resolution. Caching it saves 3-4 minutes per build.

Build Cache

Compiled artifacts (Go binaries, Python wheels, Docker layers). Rebuilt every time? Wasteful. Build once, cache it.

Package Manager Cache

npm, pip, cargo all maintain local caches. Most CI systems don’t preserve them between runs.

GitHub Actions: Dependency Caching

Here’s a Node.js example:

name: CI
on: [push, pull_request]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: '20'
          # This is the magic
          cache: 'npm'
      - run: npm ci # Not 'npm install' — ci is deterministic
      - run: npm test
      - run: npm run build

That cache: 'npm' line:

  1. Hashes package-lock.json to build a cache key
  2. Restores npm's download cache (~/.npm) when the hash matches, so npm ci installs without re-downloading packages (node_modules/ itself isn't saved)
  3. Saves the cache after the job completes

First build: 4 minutes. Subsequent builds (same lock file): about 30 seconds. 8x faster.
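The key derivation can be sketched in a few lines of shell. This is a toy reconstruction, not the action's actual code, and the key format here is a made-up example:

```shell
# Toy reconstruction of a lock-file-based cache key (an assumption about
# the mechanism; actions/setup-node's real key format differs in detail).
cd "$(mktemp -d)"
printf '{"name":"demo","lockfileVersion":3}\n' > package-lock.json
lockhash=$(sha256sum package-lock.json | cut -c1-16)
key="Linux-npm-${lockhash}"
echo "$key"   # any edit to the lock file changes the hash, and thus the key
```

The point: the key is a pure function of the lock file, so identical lock files always hit the same cache entry.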

With yarn or pnpm

- uses: actions/setup-node@v4
  with:
    node-version: '20'
    cache: 'yarn' # or 'pnpm'

Same strategy.

Python: pip Caching

name: Python CI
on: [push]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v4
        with:
          python-version: '3.11'
          cache: 'pip'
      - run: pip install -r requirements.txt
      - run: pytest

Same principle. The hash of requirements.txt becomes the cache key, and pip's download cache is restored when it hasn't changed, so installs skip the network.

Generic Caching (Any Language)

For Go, Rust, or custom build systems:

- uses: actions/cache@v3
  with:
    path: ~/.cargo/registry
    key: ${{ runner.os }}-cargo-${{ hashFiles('**/Cargo.lock') }}
    restore-keys: |
      ${{ runner.os }}-cargo-
- run: cargo build --release

This says: restore the cache whose key matches the OS plus the exact hash of Cargo.lock; failing that, fall back (via restore-keys) to the newest cache that merely starts with the OS-and-cargo prefix.

First build: downloads and compiles 500 crates. Second build: restores the registry from cache, so the downloads disappear (add target/ to path as well if you want to skip recompilation).
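The lookup logic can be modeled in shell: prefer the exact key, otherwise take the newest stored key matching the prefix. This is a simplified model, and the stored keys below are hypothetical examples:

```shell
# Simplified model of key / restore-keys lookup; the stored keys are
# hypothetical examples, newest first.
stored_keys="Linux-cargo-aaa111
Linux-cargo-bbb222"
want="Linux-cargo-ccc333"     # exact key for the current Cargo.lock hash
prefix="Linux-cargo-"

hit=$(printf '%s\n' "$stored_keys" | grep -Fx "$want" || true)
if [ -z "$hit" ]; then
  # no exact match: fall back to the newest key with the restore-keys prefix
  hit=$(printf '%s\n' "$stored_keys" | grep -F "$prefix" | head -n1)
fi
echo "restored: $hit"
```

A prefix hit gives you a slightly stale cache, which is still far cheaper than starting cold.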

Docker: Layer Caching

Docker layer caching is built-in but easy to get wrong.

Bad (No Caching)

FROM node:20
WORKDIR /app
COPY . .
RUN npm install
RUN npm run build
CMD ["node", "index.js"]

Why is this bad? If any file changes, COPY . . invalidates the cache. Dependencies reinstall every time.

Good (Cache-Aware)

FROM node:20
WORKDIR /app
# Copy only package files
COPY package*.json ./
# Install — cached until lock file changes
RUN npm ci
# Copy app code
COPY . .
RUN npm run build
CMD ["node", "index.js"]

Now:

  1. Changing app code invalidates only the layers from COPY . . onward
  2. The npm ci layer stays cached until package-lock.json changes
  3. Dependencies reinstall only when dependencies actually change

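The invalidation rule behind this ordering can be modeled as a toy hash chain: each layer's key folds in the previous layer's key, so a change to any step invalidates every step after it. This is a simplification; Docker's real cache also hashes the contents of copied files:

```shell
# Toy model of Docker layer caching: each layer key = hash(previous key + step),
# so changing one step changes every key after it.
prev=""
for step in 'COPY package*.json ./' 'RUN npm ci' 'COPY . .' 'RUN npm run build'; do
  prev=$(printf '%s|%s' "$prev" "$step" | sha256sum | cut -c1-12)
  printf '%-22s -> layer key %s\n' "$step" "$prev"
done
```

This is why the expensive RUN npm ci belongs before the frequently-changing COPY . ., never after it.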
Multi-Stage with Caching

# Build stage
FROM node:20 AS builder
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build
# Runtime stage
FROM node:20
WORKDIR /app
COPY package*.json ./
RUN npm ci --only=production
COPY --from=builder /app/dist ./dist
CMD ["node", "dist/index.js"]

Benefits:

  1. The final image ships only production dependencies, no build tooling
  2. Each stage's dependency layers cache independently of app code changes
  3. A smaller image pulls and deploys faster

GitLab CI: Cache Strategy

image: node:20

cache:
  key: ${CI_COMMIT_REF_SLUG}-npm
  paths:
    - node_modules/
    - .npm/

stages:
  - build

build:
  stage: build
  script:
    - npm ci --cache .npm
    - npm run build

The key includes the branch name. Different branches get different caches. Main builds fast. Feature branches don’t invalidate the main cache.
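You can approximate the branch-scoped key locally. This is a rough reconstruction of CI_COMMIT_REF_SLUG; GitLab's real slug rules also truncate to 63 characters and trim leading/trailing dashes:

```shell
# Rough reconstruction of ${CI_COMMIT_REF_SLUG}-npm for an example branch.
branch="Feature/Add-Login"
slug=$(printf '%s' "$branch" | tr '[:upper:]' '[:lower:]' | sed 's/[^a-z0-9][^a-z0-9]*/-/g')
echo "${slug}-npm"   # feature-add-login-npm
```

Every branch thus maps to a stable, filesystem-safe cache key of its own.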

Common Mistakes

Mistake 1: Caching Too Much

cache:
  paths:
    - . # Everything!

Don’t. Cache only what’s expensive: dependencies, build artifacts. Not source code.

Mistake 2: Not Using Deterministic Install

npm install   # Might install different versions
npm ci        # Installs exact versions from the lock file

Use npm ci (or pip install --require-hashes). Deterministic = cacheable.

Mistake 3: Ignoring Cache Size Limits

GitHub Actions: 10GB per repository, with least-recently-used entries evicted beyond that. GitLab: configurable, but watch disk usage.

Monitor cache growth. If node_modules/ is bloated, prune unused packages.

The Real Impact

Scenario: 20 developers, 50 CI runs per developer per day, 4 minutes of install time per run.

50 runs × 4 minutes × 20 devs = 4,000 minutes/day
≈ 67 hours/day of CI
≈ $32/day (at $0.008/minute, roughly the GitHub-hosted Linux runner rate)

Caching that 4 minutes down to 30 seconds:

50 runs × 0.5 minutes × 20 devs = 500 minutes/day
≈ 8 hours/day
≈ $4/day

$28/day saved. Over $7,000/year. Just from npm caching.
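Spelling the arithmetic out, with the caveat that the $0.008/minute rate is an assumption (roughly GitHub-hosted Linux runner pricing); substitute your own numbers:

```shell
# Daily CI minutes and cost, before and after caching. The rate is an
# assumed figure; plug in your own provider's pricing.
awk 'BEGIN {
  runs = 50; devs = 20; rate = 0.008        # runs/dev/day, headcount, $/min
  before = runs * 4   * devs                # 4 min of installs per run
  after  = runs * 0.5 * devs                # 30 s per run once cached
  printf "before: %d min/day, $%.0f/day\n", before, before * rate
  printf "after:  %d min/day, $%.0f/day\n", after,  after * rate
  printf "saved:  $%.0f/day\n", (before - after) * rate
}'
```

Under those assumptions it prints a saving of $28/day.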

Add Docker layer caching, build artifact caching, and you’re looking at 50% reduction in CI time.

That’s not just money. It’s developer sanity.

Implement it. Your team will notice every push coming back minutes sooner. Everyone wins.

