- Workers run JavaScript/TypeScript at 300+ edge locations worldwide — cold starts under 1ms
- Use Hono (recommended) or itty-router for routing, middleware, and REST APIs
- Workers AI runs LLM inference at the edge (Llama 3, Mistral, Gemma)
- KV stores key-value data; R2 stores files (S3-compatible); D1 is a SQLite edge database
- Durable Objects provide consistent stateful actors on the edge
- Queues handle async job processing; Hyperdrive connects to existing databases
- Deploy with `wrangler deploy`; dev locally with `wrangler dev`
Quick reference tables
Installation & setup
| Task | Command |
|---|---|
| Install Wrangler | npm install -g wrangler |
| Create project | npm create cloudflare@latest my-worker |
| Dev locally | wrangler dev |
| Deploy | wrangler deploy |
| Login to Cloudflare | wrangler login |
| Check who you’re logged in as | wrangler whoami |
| Tail logs | wrangler tail |
Core CLI
| Command | What it does |
|---|---|
| wrangler dev --local | Dev without authenticating |
| wrangler deploy --dry-run | Preview deploy without publishing |
| wrangler deploy --env staging | Deploy to staging environment |
| wrangler secret put API_KEY | Set a secret via CLI |
| wrangler kv namespace create NAMESPACE | Create a KV namespace |
| wrangler d1 create DATABASE | Create a D1 database |
| wrangler r2 bucket create BUCKET | Create an R2 bucket |
| wrangler queues create QUEUE | Create a Queue |
| wrangler rollback [version-id] | Roll back to a specific version |
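Secrets created with `wrangler secret put` surface at runtime as plain string properties on `env`. A minimal sketch of reading one in a Worker; the `API_KEY` binding name and the bearer-token check are illustrative, not prescribed:

```typescript
// Sketch: gate requests on a secret set via `wrangler secret put API_KEY`.
interface Env {
  API_KEY: string; // injected by the runtime, never committed to source
}

const worker = {
  async fetch(request: Request, env: Env): Promise<Response> {
    const auth = request.headers.get('Authorization');
    if (auth !== `Bearer ${env.API_KEY}`) {
      return new Response('Unauthorized', { status: 401 });
    }
    return new Response('OK');
  },
};

export default worker;
```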
Your first Worker
Minimal Worker

```ts
// src/index.ts
export interface Env {
  // Bindings are auto-typed here
}

export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    return new Response('Hello from the edge!');
  },
};
```

Using Hono (recommended)
```sh
npm create hono my-worker -- --template cloudflare-workers
```

```ts
// src/index.ts
import { Hono } from 'hono';
import { cors } from 'hono/cors';

const app = new Hono<{ Bindings: { DB: D1Database } }>();

app.use('*', cors());
app.get('/', (c) => c.text('Hello from Hono on the edge!'));

app.get('/api/users/:id', async (c) => {
  const id = c.req.param('id');
  const user = await c.env.DB
    .prepare('SELECT * FROM users WHERE id = ?')
    .bind(id)
    .first();
  return c.json(user);
});

app.post('/api/users', async (c) => {
  const { name, email } = await c.req.json();
  await c.env.DB
    .prepare('INSERT INTO users (name, email) VALUES (?, ?)')
    .bind(name, email)
    .run();
  return c.json({ success: true });
});

export default {
  fetch: app.fetch,
};
```

Environment bindings (wrangler.toml)
```toml
name = "my-worker"
main = "src/index.ts"
compatibility_date = "2026-01-01"

# KV Namespace
[[kv_namespaces]]
binding = "CACHE"
id = "abc123def456..."

# R2 Bucket
[[r2_buckets]]
binding = "ASSETS"
bucket_name = "my-assets-bucket"

# D1 Database
[[d1_databases]]
binding = "DB"
database_name = "my-database"
database_id = "789abc..."

# Durable Objects
[[durable_objects.bindings]]
name = "SESSION_STORE"
class_name = "SessionStore"

# Queues
[[queues.producers]]
binding = "EMAIL_JOBS"
queue = "email-jobs"

# Hyperdrive
[[hyperdrive]]
binding = "DB_PROXY"
id = "hyperdrive_connection_id"
```

KV (Key-Value Store)
```ts
export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    const cache = env.CACHE;

    // Read
    const cached = await cache.get('homepage-html');
    if (cached) {
      return new Response(cached, {
        headers: { 'Content-Type': 'text/html', 'X-Cache': 'HIT' },
      });
    }

    // Write (generateHomepage is your own rendering function)
    const html = await generateHomepage();
    await cache.put('homepage-html', html, { expirationTtl: 3600 });
    return new Response(html, {
      headers: { 'Content-Type': 'text/html', 'X-Cache': 'MISS' },
    });
  },
};
```

```ts
// List keys with a prefix (list() returns { keys, ... })
const { keys } = await cache.list({ prefix: 'user:', limit: 100 });

// Delete
await cache.delete('homepage-html');
```

R2 (Object Storage)
```ts
export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    const assets = env.ASSETS;

    if (request.method === 'POST') {
      // Upload
      const formData = await request.formData();
      const file = formData.get('file') as File;
      await assets.put(file.name, file.stream(), {
        httpMetadata: { contentType: file.type },
        customMetadata: { uploadedBy: 'user-123' },
      });
      return Response.json({ url: `/files/${file.name}` });
    }

    if (request.method === 'GET') {
      const key = new URL(request.url).pathname.replace('/files/', '');
      // get() returns the body and metadata in one round trip
      const object = await assets.get(key);
      if (!object) return Response.json({ error: 'Not found' }, { status: 404 });
      return new Response(object.body, {
        headers: {
          'Content-Type': object.httpMetadata?.contentType ?? 'application/octet-stream',
          'Content-Length': object.size.toString(),
        },
      });
    }

    return new Response('Method not allowed', { status: 405 });
  },
};
```

```ts
// Delete
await assets.delete('filename.jpg');

// List
const listed = await assets.list({ prefix: 'uploads/', limit: 50 });
for (const obj of listed.objects) {
  console.log(obj.key, obj.size, obj.uploaded);
}
```

D1 (SQLite on the Edge)
```sql
-- schema.sql
CREATE TABLE IF NOT EXISTS users (
  id TEXT PRIMARY KEY,
  name TEXT NOT NULL,
  email TEXT UNIQUE NOT NULL,
  created_at INTEGER DEFAULT (unixepoch())
);

CREATE TABLE IF NOT EXISTS posts (
  id TEXT PRIMARY KEY,
  user_id TEXT NOT NULL,
  title TEXT NOT NULL,
  content TEXT NOT NULL,
  published INTEGER DEFAULT 0,
  created_at INTEGER DEFAULT (unixepoch()),
  FOREIGN KEY (user_id) REFERENCES users(id)
);

CREATE INDEX idx_posts_user ON posts(user_id);
CREATE INDEX idx_posts_published ON posts(published);
```

```sh
# Create and apply schema
wrangler d1 create my-db
wrangler d1 execute my-db --file=schema.sql --remote
wrangler d1 execute my-db --command="SELECT * FROM users" --remote
```

Query from a Worker
```ts
const stmt = env.DB.prepare('SELECT * FROM posts WHERE published = 1 ORDER BY created_at DESC');
const { results } = await stmt.all();

const user = await env.DB
  .prepare('SELECT * FROM users WHERE id = ?')
  .bind(userId)
  .first();

await env.DB
  .prepare('INSERT INTO posts (id, user_id, title, content) VALUES (?, ?, ?, ?)')
  .bind(crypto.randomUUID(), userId, title, content)
  .run();
```

Durable Objects
```ts
// src/durable.ts
export class SessionStore implements DurableObject {
  private state: DurableObjectState;
  private sessions: Map<string, object> = new Map();

  constructor(state: DurableObjectState) {
    this.state = state;
    // Load persisted state before any request is handled
    this.state.blockConcurrencyWhile(async () => {
      const data = await this.state.storage.get<Record<string, object>>('sessions');
      if (data) this.sessions = new Map(Object.entries(data));
    });
  }

  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url);
    if (url.pathname === '/set') {
      const { key, value } = await request.json<{ key: string; value: object }>();
      this.sessions.set(key, value);
      await this.state.storage.put('sessions', Object.fromEntries(this.sessions));
      return new Response(JSON.stringify({ success: true }));
    }
    if (url.pathname === '/get') {
      const key = url.searchParams.get('key');
      return new Response(JSON.stringify(this.sessions.get(key ?? '')));
    }
    return new Response('Session Store — use /set or /get', { status: 200 });
  }
}
```

```toml
# wrangler.toml
[[durable_objects.bindings]]
name = "SESSION_STORE"
class_name = "SessionStore"

# New Durable Object classes must be declared in a migration
[[migrations]]
tag = "v1"
new_classes = ["SessionStore"]
```

```ts
// From a Worker
const id = env.SESSION_STORE.idFromName('user-123-session');
const stub = env.SESSION_STORE.get(id);
// Stub fetches need an absolute URL; the hostname is ignored
const response = await stub.fetch('https://do/get?key=lastAction');
```

Workers AI (LLM Inference)
```ts
export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    // env.AI is the native Workers AI binding:
    //   [ai]
    //   binding = "AI"
    // in wrangler.toml — no import needed.
    const result = await env.AI.run('@cf/meta/llama-3-8b-instruct', {
      messages: [
        { role: 'system', content: 'You are a helpful assistant.' },
        { role: 'user', content: 'Explain edge computing in one sentence.' },
      ],
      max_tokens: 128,
    });
    return Response.json({ response: result.response });
  },
};
```

Other available models:
- @cf/meta/llama-3-8b-instruct — Llama 3 8B
- @cf/meta/llama-3.3-70b-instruct-faster — Llama 3.3 70B
- @cf/mistral/mistral-7b-instruct-v0.2 — Mistral 7B
- @cf/google/gemma-3-4b-it — Gemma 3 4B
- @cf/deepseek-ai/DeepSeek-V3-0.1 — DeepSeek V3
- @cf/openai/whisper — Speech-to-text
- @cf/runwayml/stable-diffusion-xl-base-1.0 — Image generation
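The model IDs above are passed straight to `env.AI.run()`, but the input shape varies by model family: chat models take `messages`, Whisper takes raw audio samples, and image models take a `prompt`. A sketch of dispatching across families; the `AiBinding` interface and the exact input shapes are assumptions, so check each model's schema in the Workers AI catalog:

```typescript
// Minimal structural type for the AI binding, so the sketch stands alone
interface AiBinding {
  run(model: string, input: Record<string, unknown>): Promise<unknown>;
}

export async function transcribe(ai: AiBinding, audioBytes: Uint8Array) {
  // Speech-to-text: Whisper takes raw audio samples
  return ai.run('@cf/openai/whisper', { audio: [...audioBytes] });
}

export async function generateImage(ai: AiBinding, prompt: string) {
  // Text-to-image: diffusion models take a prompt
  return ai.run('@cf/runwayml/stable-diffusion-xl-base-1.0', { prompt });
}
```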
Queues
```ts
// Producer — send a job (EMAIL_JOBS is the producer binding from wrangler.toml)
export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    const body = await request.json<{ email: string; subject: string; message: string }>();
    await env.EMAIL_JOBS.send({
      to: body.email,
      subject: body.subject,
      body: body.message,
    });
    return Response.json({ queued: true });
  },
};
```
```ts
// Consumer Worker — process jobs
export default {
  async queue(batch: MessageBatch, env: Env, ctx: ExecutionContext): Promise<void> {
    for (const message of batch.messages) {
      const job = message.body;
      await sendEmail(job.to, job.subject, job.body); // your own delivery function
      message.ack();
    }
  },
};
```

```toml
# wrangler.toml
[[queues.consumers]]
queue = "email-jobs"
max_batch_size = 10
max_batch_timeout = 30
```

Caching
```ts
// Cache API (CDN) — TTLs are controlled via Cache-Control headers,
// not options to put()
const cache = caches.default;
let response = await cache.match(request);
if (!response) {
  response = await fetch(request);
  // Clone into a mutable Response so headers can be set
  response = new Response(response.body, response);
  response.headers.set(
    'Cache-Control',
    'max-age=3600, stale-while-revalidate=86400', // fresh 1h, stale for 24h while revalidating
  );
  await cache.put(request, response.clone());
}
return response;
```

Summary
- `wrangler dev` for local dev; `wrangler deploy` to publish
- Hono + TypeScript = best DX for building APIs
- KV for fast key-value reads; D1 for SQLite queries; R2 for files
- Durable Objects for consistent stateful singletons
- Workers AI for LLM inference at the edge
- Queues for async processing; Cache API for CDN caching
FAQ
How do cold starts compare to AWS Lambda? Workers cold starts are under 1ms — orders of magnitude faster than Lambda's 100–500ms. Workers runs on V8 isolates rather than containers, so there is no runtime to boot per request.
What is the 30-second CPU limit? Workers on the paid plan allow up to 30 seconds of CPU time per request (not wall-clock time; awaiting I/O is free). Long-running tasks should use Queues or Durable Objects for background work.
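One common pattern is to respond immediately and defer non-critical work with `ctx.waitUntil()`. A sketch under stated assumptions: the `Ctx` type is a minimal stand-in for `ExecutionContext`, and `logAnalytics` is a hypothetical helper:

```typescript
// Minimal shape of ExecutionContext, so the sketch stands alone
type Ctx = { waitUntil(promise: Promise<unknown>): void };

// Hypothetical background task — replace with real logging/analytics
async function logAnalytics(url: string): Promise<void> {
  console.log('visited:', url);
}

const worker = {
  async fetch(request: Request, _env: unknown, ctx: Ctx): Promise<Response> {
    // Deferred work keeps running after the response is sent, but it still
    // counts toward the CPU limit — push genuinely heavy jobs to a Queue.
    ctx.waitUntil(logAnalytics(request.url));
    return new Response('OK');
  },
};

export default worker;
```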
Can I use Node.js modules in Workers?
Workers use the WinterCG subset of Web APIs. Node.js polyfills are available via the nodejs_compat compatibility flag. npm packages that depend on native modules (like sharp for image processing) don't work — use Cloudflare Images or R2 for media.
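For example, with `compatibility_flags = ["nodejs_compat"]` set in wrangler.toml, `node:` built-ins such as `node:buffer` import cleanly. A small sketch; the helper functions are illustrative:

```typescript
// Requires `compatibility_flags = ["nodejs_compat"]` in wrangler.toml;
// node: built-ins are then importable inside the Worker.
import { Buffer } from 'node:buffer';

// Node-style base64 round-trip helpers
export function toBase64(text: string): string {
  return Buffer.from(text, 'utf-8').toString('base64');
}

export function fromBase64(b64: string): string {
  return Buffer.from(b64, 'base64').toString('utf-8');
}
```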
How does Workers AI pricing work? Workers AI has per-token and per-request pricing depending on the model. The free tier includes limited inference. Check developers.cloudflare.com for current pricing.
What is compatibility_date?
compatibility_date pins the runtime's behavior to a specific date: your Worker keeps the semantics the runtime had on that date, and bumping the date opts you into newer behavior and fixes without code changes. Set it to the current date when creating a new project.
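A typical pairing in wrangler.toml (the date and flag values here are illustrative):

```toml
compatibility_date = "2026-01-01"        # freeze runtime semantics at this date
compatibility_flags = ["nodejs_compat"]  # opt into specific features explicitly
```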
What to read next
- Docker Cheat Sheet — containerize Workers with custom runtimes
- GitHub Actions Cheat Sheet — CI/CD pipeline for Workers deployment
- TypeScript Cheat Sheet — type-safe Workers with Hono