
Cloudflare Workers Cheat Sheet: Edge Functions, Durable Objects & Storage

Vishnu
By Vishnu
| Updated: May 12, 2026
TL;DR
  • Workers run JavaScript/TypeScript at 300+ edge locations worldwide — cold starts under 1ms
  • Use Hono (recommended) or itty-router for routing, middleware, and REST APIs
  • Workers AI runs LLM inference at the edge (Llama 3, Mistral, Gemma)
  • KV stores key-value data; R2 stores files (S3-compatible); D1 is a SQLite edge database
  • Durable Objects provide consistent stateful actors on the edge
  • Queues handle async job processing; Hyperdrive connects to existing databases
  • Deploy with wrangler deploy; dev locally with wrangler dev

Quick reference tables

Installation & setup

| Task | Command |
| --- | --- |
| Install Wrangler | `npm install -g wrangler` |
| Create project | `npm create cloudflare@latest my-worker` |
| Dev locally | `wrangler dev` |
| Deploy | `wrangler deploy` |
| Log in to Cloudflare | `wrangler login` |
| Check who you’re logged in as | `wrangler whoami` |
| Tail logs | `wrangler tail` |

Core CLI

| Command | What it does |
| --- | --- |
| `wrangler dev --local` | Dev without authenticating |
| `wrangler deploy --dry-run` | Preview a deploy without publishing |
| `wrangler deploy --env staging` | Deploy to the staging environment |
| `wrangler secret put API_KEY` | Set a secret via the CLI |
| `wrangler kv namespace create NAMESPACE` | Create a KV namespace |
| `wrangler d1 create DATABASE` | Create a D1 database |
| `wrangler r2 bucket create BUCKET` | Create an R2 bucket |
| `wrangler queues create QUEUE` | Create a Queue |
| `wrangler rollback [version-id]` | Roll back to a specific version |

Your first Worker

Minimal Worker

typescript
// src/index.ts
export interface Env {
  // Bindings are auto-typed here
}

export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    return new Response('Hello from the edge!');
  },
};
Hono app

bash
npm create hono@latest my-worker -- --template cloudflare-workers
typescript
// src/index.ts
import { Hono } from 'hono';
import { cors } from 'hono/cors';

const app = new Hono<{ Bindings: { DB: D1Database } }>();

app.use('*', cors());

app.get('/', (c) => c.text('Hello from Hono on the edge!'));

app.get('/api/users/:id', async (c) => {
  const id = c.req.param('id');
  const user = await c.env.DB
    .prepare('SELECT * FROM users WHERE id = ?')
    .bind(id)
    .first();
  return c.json(user);
});

app.post('/api/users', async (c) => {
  const { name, email } = await c.req.json();
  await c.env.DB
    .prepare('INSERT INTO users (name, email) VALUES (?, ?)')
    .bind(name, email)
    .run();
  return c.json({ success: true });
});

export default {
  fetch: app.fetch,
};

Environment bindings (wrangler.toml)

toml
name = "my-worker"
main = "src/index.ts"
compatibility_date = "2026-01-01"

# KV Namespace
[[kv_namespaces]]
binding = "CACHE"
id = "abc123def456..."

# R2 Bucket
[[r2_buckets]]
binding = "ASSETS"
bucket_name = "my-assets-bucket"

# D1 Database
[[d1_databases]]
binding = "DB"
database_name = "my-database"
database_id = "789abc..."

# Durable Objects (new classes also need a migration)
[[durable_objects.bindings]]
name = "SESSION_STORE"
class_name = "SessionStore"

[[migrations]]
tag = "v1"
new_classes = ["SessionStore"]

# Queues
[[queues.producers]]
binding = "EMAIL_QUEUE"
queue = "email-jobs"

# Hyperdrive
[[hyperdrive]]
binding = "DB_PROXY"
id = "hyperdrive_connection_id"
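Each binding declared in wrangler.toml surfaces as a property of the same name on env. A sketch of the matching Env interface for the config above; the placeholder type aliases stand in for the real types from @cloudflare/workers-types:

```typescript
// Placeholder aliases so this sketch is self-contained; in a real project
// these come from @cloudflare/workers-types.
type KVNamespace = unknown;
type R2Bucket = unknown;
type D1Database = unknown;
type DurableObjectNamespace = unknown;
type Hyperdrive = unknown;

// One property per binding declared in wrangler.toml.
interface Env {
  CACHE: KVNamespace;                    // [[kv_namespaces]] binding
  ASSETS: R2Bucket;                      // [[r2_buckets]] binding
  DB: D1Database;                        // [[d1_databases]] binding
  SESSION_STORE: DurableObjectNamespace; // [[durable_objects.bindings]] name
  DB_PROXY: Hyperdrive;                  // hyperdrive binding
}

// Handy for a startup sanity check that every expected binding is present.
const expectedBindings: (keyof Env)[] = ['CACHE', 'ASSETS', 'DB', 'SESSION_STORE', 'DB_PROXY'];
```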

KV (Key-Value Store)

typescript
export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    const cache = env.CACHE;

    // Read
    const cached = await cache.get('homepage-html');
    if (cached) {
      return new Response(cached, {
        headers: { 'Content-Type': 'text/html', 'X-Cache': 'HIT' },
      });
    }

    // Write
    const html = await generateHomepage();
    await cache.put('homepage-html', html, { expirationTtl: 3600 });

    return new Response(html, {
      headers: { 'Content-Type': 'text/html', 'X-Cache': 'MISS' },
    });
  },
};

// List keys with a prefix (list() returns { keys, list_complete, cursor })
const { keys } = await cache.list({ prefix: 'user:', limit: 100 });

// Delete
await cache.delete('homepage-html');
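KV can also attach small JSON metadata to a key, returned without a second lookup via getWithMetadata (and in list() results). A sketch under assumptions: userKey is a hypothetical key-naming helper, and KVLite is a structural stand-in for the real KV type from @cloudflare/workers-types:

```typescript
// Hypothetical key-naming helper: a shared prefix keeps list({ prefix }) scans cheap.
const userKey = (id: string) => `user:${id}`;

// Structural sketch of the two KV methods used here.
interface KVLite {
  put(key: string, value: string, opts?: { expirationTtl?: number; metadata?: object }): Promise<void>;
  getWithMetadata(key: string): Promise<{ value: string | null; metadata: unknown }>;
}

async function cacheUser(kv: KVLite, id: string, profile: object): Promise<void> {
  // Metadata rides along with the key; no extra read needed to see it later.
  await kv.put(userKey(id), JSON.stringify(profile), {
    expirationTtl: 3600,
    metadata: { cachedAt: Date.now() },
  });
}

async function readUser(kv: KVLite, id: string) {
  const { value, metadata } = await kv.getWithMetadata(userKey(id));
  return value ? { profile: JSON.parse(value), metadata } : null;
}
```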

R2 (Object Storage)

typescript
export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    const assets = env.ASSETS;

    if (request.method === 'POST') {
      // Upload
      const formData = await request.formData();
      const file = formData.get('file') as File;
      await assets.put(file.name, file.stream(), {
        httpMetadata: { contentType: file.type },
        customMetadata: { uploadedBy: 'user-123' },
      });
      return Response.json({ url: `/files/${file.name}` });
    }

    if (request.method === 'GET') {
      const key = new URL(request.url).pathname.replace('/files/', '');
      // get() returns body and metadata in one round trip; no head() needed
      const object = await assets.get(key);
      if (!object) return Response.json({ error: 'Not found' }, { status: 404 });
      return new Response(object.body, {
        headers: {
          'Content-Type': object.httpMetadata?.contentType ?? 'application/octet-stream',
          'Content-Length': object.size.toString(),
        },
      });
    }

    return new Response('Method not allowed', { status: 405 });
  },
};

// Delete
await assets.delete('filename.jpg');

// List
const listed = await assets.list({ prefix: 'uploads/', limit: 50 });
for (const obj of listed.objects) {
  console.log(obj.key, obj.size, obj.uploaded);
}
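Objects uploaded outside the Worker (e.g. via rclone or the dashboard) may have no httpMetadata, so a content-type fallback is useful when serving them. A sketch; the extension map is a hypothetical, deliberately small example:

```typescript
// Hypothetical extension-to-MIME map for R2 objects stored without httpMetadata.
const MIME_TYPES: Record<string, string> = {
  '.html': 'text/html',
  '.css': 'text/css',
  '.js': 'text/javascript',
  '.png': 'image/png',
  '.jpg': 'image/jpeg',
};

// Guess a Content-Type from the object key; fall back to a safe binary default.
function contentTypeFor(key: string): string {
  const dot = key.lastIndexOf('.');
  const ext = dot === -1 ? '' : key.slice(dot).toLowerCase();
  return MIME_TYPES[ext] ?? 'application/octet-stream';
}
```

Prefer the stored httpMetadata.contentType when present and use this only as the fallback branch.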

D1 (SQLite on the Edge)

sql
-- schema.sql
CREATE TABLE IF NOT EXISTS users (
  id TEXT PRIMARY KEY,
  name TEXT NOT NULL,
  email TEXT UNIQUE NOT NULL,
  created_at INTEGER DEFAULT (unixepoch())
);

CREATE TABLE IF NOT EXISTS posts (
  id TEXT PRIMARY KEY,
  user_id TEXT NOT NULL,
  title TEXT NOT NULL,
  content TEXT NOT NULL,
  published INTEGER DEFAULT 0,
  created_at INTEGER DEFAULT (unixepoch()),
  FOREIGN KEY (user_id) REFERENCES users(id)
);

CREATE INDEX idx_posts_user ON posts(user_id);
CREATE INDEX idx_posts_published ON posts(published);
bash
# Create and apply schema
wrangler d1 create my-db
wrangler d1 execute my-db --file=schema.sql --remote
wrangler d1 execute my-db --command="SELECT * FROM users" --remote
typescript
// Query from a Worker
const stmt = env.DB.prepare('SELECT * FROM posts WHERE published = 1 ORDER BY created_at DESC');
const { results } = await stmt.all();

const user = await env.DB
  .prepare('SELECT * FROM users WHERE id = ?')
  .bind(userId)
  .first();

await env.DB
  .prepare('INSERT INTO posts (id, user_id, title, content) VALUES (?, ?, ?, ?)')
  .bind(crypto.randomUUID(), userId, title, content)
  .run();
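D1 also supports batch(), which runs several prepared statements in one round trip as a single implicit transaction; for bulk inserts a multi-row VALUES clause works too. A sketch under assumptions: insertPlaceholders is a hypothetical helper, and D1Lite is a structural stand-in for the real D1 types:

```typescript
// Structural sketch of the D1 surface used here (real types: @cloudflare/workers-types).
interface D1Stmt {
  bind(...values: unknown[]): D1Stmt;
  run(): Promise<unknown>;
}
interface D1Lite {
  prepare(sql: string): D1Stmt;
}

// Hypothetical helper: builds "(?, ?, ?), (?, ?, ?)" for a multi-row INSERT.
function insertPlaceholders(rows: number, cols: number): string {
  const row = `(${Array(cols).fill('?').join(', ')})`;
  return Array(rows).fill(row).join(', ');
}

// One statement inserting many rows; for heterogeneous statements use env.DB.batch([...]).
async function insertUsers(db: D1Lite, users: { id: string; name: string; email: string }[]): Promise<void> {
  const sql = `INSERT INTO users (id, name, email) VALUES ${insertPlaceholders(users.length, 3)}`;
  await db.prepare(sql).bind(...users.flatMap((u) => [u.id, u.name, u.email])).run();
}
```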

Durable Objects

typescript
// src/durable.ts
export class SessionStore implements DurableObject {
  private state: DurableObjectState;
  private sessions: Map<string, object> = new Map();

  constructor(state: DurableObjectState) {
    this.state = state;
    // Load persisted state before any request is handled (avoids a startup race)
    this.state.blockConcurrencyWhile(async () => {
      const data = await this.state.storage.get<Record<string, object>>('sessions');
      if (data) this.sessions = new Map(Object.entries(data));
    });
  }

  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url);

    if (url.pathname === '/set') {
      const { key, value } = await request.json();
      this.sessions.set(key, value);
      await this.state.storage.put('sessions', Object.fromEntries(this.sessions));
      return new Response(JSON.stringify({ success: true }));
    }

    if (url.pathname === '/get') {
      const key = url.searchParams.get('key');
      return new Response(JSON.stringify(this.sessions.get(key ?? '')));
    }

    return new Response('Session Store — use /set or /get', { status: 200 });
  }
}
toml
# wrangler.toml
[[durable_objects.bindings]]
name = "SESSION_STORE"
class_name = "SessionStore"
typescript
// From a Worker
const id = env.SESSION_STORE.idFromName('user-123-session');
const stub = env.SESSION_STORE.get(id);
const response = await stub.fetch('https://do/get?key=lastAction'); // stub.fetch() needs an absolute URL
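Durable Objects can also wake themselves later via the storage alarm API: setAlarm() schedules a call to the object's alarm() handler. A sketch of a retry-with-backoff pattern; nextDelayMs is a hypothetical helper, and DOState is a structural stand-in for DurableObjectState:

```typescript
// Hypothetical exponential-backoff helper: 1s, 2s, 4s, ... capped at 60s.
function nextDelayMs(attempt: number): number {
  return Math.min(1000 * 2 ** attempt, 60_000);
}

// Structural sketch of the storage API used (real types: @cloudflare/workers-types).
interface DOState {
  storage: {
    get<T>(key: string): Promise<T | undefined>;
    put(key: string, value: unknown): Promise<void>;
    delete(key: string): Promise<boolean>;
    setAlarm(scheduledTime: number): Promise<void>;
  };
}

export class FlushQueue {
  constructor(private state: DOState) {}

  async fetch(_request: Request): Promise<Response> {
    // Schedule the first flush attempt one second from now.
    await this.state.storage.setAlarm(Date.now() + nextDelayMs(0));
    return new Response('scheduled');
  }

  async alarm(): Promise<void> {
    const attempt = (await this.state.storage.get<number>('attempt')) ?? 0;
    try {
      // ... flush buffered work here ...
      await this.state.storage.delete('attempt');
    } catch {
      // Failed: persist the attempt count and reschedule with a longer delay.
      await this.state.storage.put('attempt', attempt + 1);
      await this.state.storage.setAlarm(Date.now() + nextDelayMs(attempt + 1));
    }
  }
}
```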

Workers AI (LLM Inference)

typescript
export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    // env.AI is the Workers AI binding (declared with [ai] binding = "AI" in wrangler.toml)
    const response = await env.AI.run('@cf/meta/llama-3-8b-instruct', {
      messages: [
        { role: 'system', content: 'You are a helpful assistant.' },
        { role: 'user', content: 'Explain edge computing in one sentence.' },
      ],
      max_tokens: 128,
    });

    return Response.json({ response: response.response });
  },
};

Other available models:

  • @cf/meta/llama-3-8b-instruct — Llama 3 8B
  • @cf/meta/llama-3.3-70b-instruct-fp8-fast — Llama 3.3 70B
  • @cf/mistral/mistral-7b-instruct-v0.2 — Mistral 7B
  • @cf/google/gemma-3-4b-it — Gemma 3 4B
  • @cf/deepseek-ai/DeepSeek-V3-0.1 — DeepSeek V3
  • @cf/openai/whisper — Speech-to-text
  • @cf/stabilityai/stable-diffusion-xl-base-1.0 — Image generation
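Text models can also stream: with stream: true, env.AI.run resolves to a ReadableStream of server-sent events that a Worker can return directly. A minimal sketch, assuming an AI binding named AI; sseHeaders is a hypothetical helper:

```typescript
export default {
  async fetch(
    request: Request,
    env: { AI: { run(model: string, opts: object): Promise<ReadableStream> } },
  ): Promise<Response> {
    // With stream: true the call yields an SSE byte stream instead of a JSON object.
    const stream = await env.AI.run('@cf/meta/llama-3-8b-instruct', {
      messages: [{ role: 'user', content: 'Explain edge computing in one sentence.' }],
      stream: true,
    });
    return new Response(stream, { headers: sseHeaders() });
  },
};

// Headers for proxying a server-sent-events stream through the Worker.
function sseHeaders(): Record<string, string> {
  return { 'Content-Type': 'text/event-stream', 'Cache-Control': 'no-cache' };
}
```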

Queues

typescript
// Producer — send a job
export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    const body = await request.json<{ email: string; subject: string; message: string }>();
    // EMAIL_QUEUE is the producer binding from wrangler.toml; send() enqueues one message
    await env.EMAIL_QUEUE.send({
      to: body.email,
      subject: body.subject,
      body: body.message,
    });
    return Response.json({ queued: true });
  },
};

// Consumer Worker — process jobs
export default {
  async queue(batch: MessageBatch, env: Env, ctx: ExecutionContext): Promise<void> {
    for (const message of batch.messages) {
      const job = message.body;
      await sendEmail(job.to, job.subject, job.body);
      message.ack();
    }
  },
};
toml
# wrangler.toml
[[queues.consumers]]
queue = "email-jobs"
max_batch_size = 10
max_batch_timeout = 30
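Messages that are neither acked nor retried are redelivered; calling message.retry() explicitly gives per-message control, and message.attempts lets you cap redelivery (pair this with max_retries and a dead-letter queue in wrangler.toml). A sketch; Msg is a structural stand-in for the real Message type, and MAX_ATTEMPTS is an assumed cap:

```typescript
// Structural sketch of the queue message surface (real types: @cloudflare/workers-types).
interface Msg<T> {
  body: T;
  attempts: number; // how many times this message has been delivered
  ack(): void;
  retry(): void;
}

const MAX_ATTEMPTS = 3; // assumed cap; align with max_retries in wrangler.toml

// Ack or retry each message independently so one bad job doesn't requeue the whole batch.
async function handleBatch<T>(messages: Msg<T>[], process: (body: T) => Promise<void>): Promise<void> {
  for (const message of messages) {
    try {
      await process(message.body);
      message.ack();
    } catch {
      // Give up after the cap; otherwise ask for redelivery.
      if (message.attempts >= MAX_ATTEMPTS) message.ack();
      else message.retry();
    }
  }
}
```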

Caching

typescript
// Cache API (CDN)
const cache = caches.default;
let response = await cache.match(request);

if (!response) {
  response = await fetch(request);
  // Re-wrap the response so its headers are mutable
  response = new Response(response.body, response);
  // cache.put() takes no options object; lifetime is driven by Cache-Control:
  // fresh for 1 hour, then served stale for 24h while revalidating
  response.headers.set('Cache-Control', 'max-age=3600, stale-while-revalidate=86400');
  await cache.put(request, response.clone());
}

Summary

  • wrangler dev for local dev; wrangler deploy to publish
  • Hono + TypeScript = best DX for building APIs
  • KV for fast key-value reads; D1 for SQLite queries; R2 for files
  • Durable Objects for consistent stateful singletons
  • Workers AI for LLM inference at the edge
  • Queues for async processing; Cache API for CDN caching

FAQ

How do cold starts compare to AWS Lambda? Workers cold starts are under 1ms, orders of magnitude faster than Lambda’s 100–500ms, because Workers runs on V8 isolates rather than containers: there is no runtime to boot per request.

What is the 30-second CPU limit? On the paid plan, a Worker can use up to 30 seconds of CPU time per request; this counts compute only, not wall-clock time spent waiting on I/O. The free plan allows 10ms of CPU time. Long-running tasks should use Queues or Durable Objects for background work.

Can I use Node.js modules in Workers? Workers use the WinterCG subset of Web APIs. Node.js polyfills are available via nodejs_compat. Standard npm packages that depend on native modules (like sharp for image processing) don’t work — use Cloudflare Images or R2 for media.

How does Workers AI pricing work? Workers AI is priced per request and per token depending on the model, with a limited free tier of inference. Check developers.cloudflare.com for current pricing.

What is compatibility_date? compatibility_date pins the Workers runtime’s behavior to what had shipped as of that date, so platform updates cannot silently change how your code runs. Bump it deliberately to opt into newer runtime behavior, and set it to a recent date when starting a new project.