What Is Edge Computing?
Edge computing runs code in data centers geographically close to the end user, reducing latency by processing requests at the "edge" of the network rather than in a single centralized data center.
Traditional servers run in one or a few locations. When a user in Tokyo hits a server in Virginia, every request crosses the planet, adding 150–300ms of latency before your code even starts running. Edge computing removes most of that penalty by running your code in 100+ locations simultaneously.
How edge runtimes work: Platforms like Vercel Edge, Cloudflare Workers, and Deno Deploy deploy your code to a global network. Requests are routed to the nearest node. The code runs in a lightweight V8 isolate (not a full Node.js process) — startup time is near-zero.
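A minimal edge handler can sketch this out. Edge runtimes expose the Web-standard `Request`/`Response` API rather than Node's `http` module; the export shape below follows Cloudflare Workers' module syntax (Vercel Edge and Deno Deploy differ slightly), so treat it as an illustrative sketch rather than portable code.

```typescript
// A minimal edge handler using the Web-standard Request/Response API.
// No server bootstrap, no http module — the isolate invokes fetch() per request.
const handler = {
  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url);
    // Respond directly from the nearest edge node — no origin round trip.
    return new Response(JSON.stringify({ path: url.pathname }), {
      headers: { "content-type": "application/json" },
    });
  },
};

export default handler;
```

Because the isolate is just a V8 context, there is no process to boot: the platform routes the request to the nearest node and invokes `fetch()` directly.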
Edge vs. serverless functions:
| | Serverless | Edge |
|---|---|---|
| Runtime | Node.js | V8 isolate (limited APIs) |
| Cold start | 100–500ms | <5ms |
| Location | Single region | Global PoPs |
| Node APIs | Full | Restricted |
Edge use cases:
- Middleware (auth checks, redirects, A/B tests) before the page renders
- Personalization based on geolocation
- API responses that must be fast globally
- Bot detection without round-tripping to a central server
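The middleware and personalization cases above can be combined in one sketch. How geolocation reaches your code is platform-specific (a request property on some platforms, a header on others); the `x-user-country` header here is a hypothetical stand-in for illustration.

```typescript
// Geolocation-aware middleware sketch. The "x-user-country" header is an
// assumed placeholder — real platforms expose geo data in platform-specific ways.
function localize(request: Request): Response {
  const country = request.headers.get("x-user-country") ?? "US";
  const euCountries = new Set(["DE", "FR", "ES", "IT", "NL"]);
  // Redirect EU visitors to a region-specific page before anything renders.
  if (euCountries.has(country)) {
    return Response.redirect("https://example.com/eu", 307);
  }
  return new Response("default content");
}
```

Because this runs at the edge, the redirect decision happens in the PoP closest to the user, before any origin request or page render.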
Limitations: Edge runtimes don't support the full Node.js API surface: no filesystem access, no native modules. Most traditional database clients don't work at the edge, because a Postgres TCP connection is too expensive to initialize per request. Use HTTP-based data APIs (such as Supabase's REST API) or a connection pooler instead.
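In practice, data access from an edge isolate usually goes over HTTP via the global `fetch`. A minimal sketch, assuming a hypothetical REST endpoint (the URL and response shape are illustrative, not a real API); the injectable `fetchFn` parameter is a testability convenience, not a platform requirement:

```typescript
// Data access over HTTP instead of a raw database connection — fetch is
// available globally in edge runtimes, unlike most TCP database drivers.
async function getUser(
  id: string,
  fetchFn: typeof fetch = fetch // injectable for testing; defaults to global fetch
): Promise<unknown> {
  const res = await fetchFn(`https://api.example.com/users/${id}`, {
    headers: { accept: "application/json" },
  });
  if (!res.ok) throw new Error(`upstream error: ${res.status}`);
  return res.json();
}
```

The upstream API (or a pooler in front of the database) holds the persistent connections, so each short-lived isolate only pays for a stateless HTTP request.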