
Cloudflare Workers: Edge Without the PhD

By SumGuy 6 min read

Your CDN Has Been Hiding a Superpower

Most people use Cloudflare as a CDN and DNS manager. Proxy traffic, cache some stuff, maybe set up a page rule or two. Fine. Useful. Boring.

But buried inside that dashboard is something genuinely powerful: Cloudflare Workers. JavaScript (or WASM) that runs at Cloudflare’s edge — in 300+ datacenters worldwide — before the request ever touches your origin server.

No cold starts. No containers to manage. No VPC config. And a free tier that’s actually usable: 100,000 requests per day, 10ms CPU per request, zero cost.

Here’s the thing — this isn’t “serverless” as a buzzword. It’s serverless as in: you write a function, deploy it in 2 minutes, and it runs 20ms from your users globally. Let’s dig in.


What Workers Actually Are

A Worker is a JavaScript function that intercepts HTTP requests. It runs in Cloudflare’s V8 isolates — same engine as Chrome, not Node.js — at whichever edge location is closest to your visitor.

The mental model:

User Request → Cloudflare Edge (your Worker runs here) → Origin Server

Your Worker can inspect, modify, respond to, or forward that request. It can also just be the response — no origin needed at all.

The 10ms CPU limit sounds scary until you understand what it actually means: 10ms of compute time, not 10ms of wall time. A Worker can wait 30 seconds for a fetch() to return. The clock only runs when your code is actually executing. For most use cases — header injection, redirects, auth checks, routing — you’ll use under 1ms of CPU.
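A minimal sketch of that distinction, using a timer as a stand-in for a slow upstream call (a hypothetical handler, not tied to any real origin):

```javascript
// Hypothetical handler illustrating CPU time vs wall time.
const worker = {
  async fetch(request) {
    // Pretend this is a slow upstream fetch(): the Worker sits idle here.
    // Wall time passes, but the 10ms CPU meter is effectively paused.
    await new Promise((resolve) => setTimeout(resolve, 500));

    // Only synchronous work like this counts against the CPU budget,
    // and it completes in well under a millisecond.
    return new Response(JSON.stringify({ waitedMs: 500 }), {
      headers: { "Content-Type": "application/json" },
    });
  },
};

export default worker;
```

Half a second of elapsed time, a fraction of a millisecond on the meter.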


Getting Started: Two Minutes to Hello World

Install wrangler and scaffold a project:

npm install -g wrangler
wrangler login
wrangler init my-worker
cd my-worker

The generated src/index.js looks like this:

src/index.js
export default {
  async fetch(request, env, ctx) {
    return new Response("Hello from the edge!", {
      headers: { "Content-Type": "text/plain" },
    });
  },
};

Deploy it:

wrangler deploy

That’s it. You now have a globally distributed function running at https://my-worker.your-subdomain.workers.dev. No Dockerfile. No ECS cluster. No bill.


Real Use Cases (Not Toy Examples)

Redirect Manager

Managing redirects without touching your server config or redeploying your app:

src/index.js
const REDIRECTS = {
  "/old-blog": "https://sumguy.com/blog",
  "/discord": "https://discord.gg/yourinvite",
  "/download": "https://github.com/you/repo/releases/latest",
};

export default {
  async fetch(request) {
    const url = new URL(request.url);
    const destination = REDIRECTS[url.pathname];
    if (destination) {
      return Response.redirect(destination, 301);
    }
    return fetch(request);
  },
};

Update the map, redeploy, done. Your origin server never knew anything changed.

Security Headers Without Touching Origin

Add CSP, HSTS, and friends to every response — even if your origin is a static S3 bucket that can’t set headers:

src/index.js
export default {
  async fetch(request) {
    const response = await fetch(request);
    const newHeaders = new Headers(response.headers);
    newHeaders.set("Strict-Transport-Security", "max-age=31536000; includeSubDomains");
    newHeaders.set("X-Content-Type-Options", "nosniff");
    newHeaders.set("Referrer-Policy", "strict-origin-when-cross-origin");
    newHeaders.set(
      "Content-Security-Policy",
      "default-src 'self'; script-src 'self' 'unsafe-inline'"
    );
    return new Response(response.body, {
      status: response.status,
      headers: newHeaders,
    });
  },
};

Security team happy. No nginx config touched. No 2 AM deployment.

Geo-Based Routing

Cloudflare injects request metadata automatically — including the visitor’s country:

src/index.js
export default {
  async fetch(request) {
    const country = request.cf?.country;
    const pathname = new URL(request.url).pathname;
    if (country === "DE" || country === "FR" || country === "NL") {
      return fetch("https://eu.your-origin.com" + pathname, request);
    }
    return fetch("https://us.your-origin.com" + pathname, request);
  },
};

EU users hit your EU origin, everyone else hits US. GDPR team sends a fruit basket.
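If the EU list grows past three countries, a set plus a small pure helper keeps the handler flat. A hypothetical variant of the snippet above (the origin hostnames are the same placeholders):

```javascript
// Country codes routed to the EU origin (extend as needed).
const EU_COUNTRIES = new Set(["DE", "FR", "NL", "IT", "ES", "PL"]);

// Pure helper: pick an origin base URL for a country code.
// Unknown or missing countries fall through to the US origin.
function pickOrigin(country) {
  return EU_COUNTRIES.has(country)
    ? "https://eu.your-origin.com"
    : "https://us.your-origin.com";
}

export default {
  async fetch(request) {
    const origin = pickOrigin(request.cf?.country);
    return fetch(origin + new URL(request.url).pathname, request);
  },
};
```

Keeping the decision in a pure function also makes it trivially unit-testable outside the Workers runtime.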

API Auth Proxy

Your third-party API key lives in Workers secrets — never exposed to the browser:

src/index.js
export default {
  async fetch(request, env) {
    const url = new URL(request.url);
    // Only proxy /api/* paths
    if (!url.pathname.startsWith("/api/")) {
      return new Response("Not found", { status: 404 });
    }
    const upstreamUrl = "https://api.thirdparty.com" + url.pathname;
    const upstreamRequest = new Request(upstreamUrl, {
      method: request.method,
      headers: {
        ...Object.fromEntries(request.headers),
        "Authorization": `Bearer ${env.API_KEY}`,
      },
      body: request.body,
    });
    return fetch(upstreamRequest);
  },
};

Set API_KEY via wrangler secret put API_KEY. Gone from your frontend. Gone from your git history.
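For local development, wrangler reads variables from a git-ignored .dev.vars file instead, so the production key never needs to sit on your laptop. A sketch; the variable name simply mirrors the example above:

```ini
# .dev.vars: loaded automatically by `wrangler dev`; add it to .gitignore
API_KEY="local-test-key"
```

In production the value set by wrangler secret put wins; .dev.vars only applies to local runs.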


KV Storage: Lightweight State at the Edge

Workers are stateless by default, but Workers KV gives you a key-value store that’s globally replicated. Think Redis, but distributed to every edge location.

src/index.js
export default {
  async fetch(request, env) {
    // Read a feature flag from the CONFIG KV namespace binding
    const flagEnabled = await env.CONFIG.get("feature_new_checkout");
    if (flagEnabled === "true") {
      return fetch("https://origin.com/v2" + new URL(request.url).pathname);
    }
    return fetch(request);
  },
};

KV is eventually consistent (updates propagate in ~60 seconds) and optimized for high reads, low writes. Perfect for: feature flags, config values, URL shortener mappings, caching expensive API responses. Not great for: counters updated thousands of times per second or anything needing strong consistency.
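The env.CONFIG binding in the example above comes from your wrangler configuration, and values can be seeded from the CLI. A sketch, assuming a namespace created with wrangler kv namespace create (the id is a placeholder; on older wrangler versions the subcommand is kv:key put):

```toml
# wrangler.toml
name = "my-worker"
main = "src/index.js"

[[kv_namespaces]]
binding = "CONFIG"
id = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
```

Then flip the flag without a deploy: wrangler kv key put --binding=CONFIG feature_new_checkout true.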


Workers vs Lambda vs Vercel Edge Functions

|             | Workers      | Lambda           | Vercel Edge   |
|-------------|--------------|------------------|---------------|
| Runtime     | V8 isolates  | Node/Python/etc. | V8 isolates   |
| Cold starts | None         | Yes (100ms-2s)   | None          |
| Locations   | 300+         | ~30 regions      | ~70 locations |
| Free tier   | 100k req/day | 1M req/month     | 1M req/month  |
| CPU limit   | 10ms         | 15 min           | 50ms          |
| Pricing     | $5/10M req   | Pay per ms       | $20/1M req    |

Lambda wins on raw execution time — 15 minutes vs 10ms CPU is a different category of workload. Vercel Edge is nicer if you’re already in the Next.js ecosystem. Workers wins on distribution, cold start, and price for high-volume edge logic.


When Workers Is the Wrong Tool

Be honest with yourself here. Workers is not for:

- CPU-heavy work like image transcoding or ML inference: the 10ms CPU budget is real
- Long-running jobs: 15 minutes of Lambda compute is a different category of workload
- Code that leans on Node.js APIs, the filesystem, or native modules: isolates are V8, not Node
- Write-heavy or strongly consistent state: KV is eventually consistent

The sweet spot is request transformation, routing, auth, and edge caching. Thin, fast, stateless logic that benefits from being close to your users.


The Practical Bottom Line

If you’re already behind Cloudflare, you have Workers. Right now. For free. And it can replace a bunch of infrastructure you’re either paying for or maintaining manually.

Redirects that used to require an nginx config? Worker. Security headers that required a deployment? Worker. API key exposure that kept your security lead up at night? Worker.

The 10ms CPU limit keeps you honest — it forces you to keep Workers doing what they’re good at. And what they’re good at, they do better than anything else on the market.

Start with a redirect manager or header injector. Deploy it in an afternoon. Then you’ll start seeing Workers-shaped holes in your stack everywhere.

Your CDN was hiding a superpower the whole time. Might as well use it.

