Serverless functions and runtimes
Run isolated serverless functions in real containers: the same event model, layers, and hot pools across languages, with no need to split services across runtimes.
How it works
Same platform, any language
Pick a language
Choose Node.js 22, Python 3.12, or Go 1.22 when creating a function. Switch any time.
Write your code
Use any package manager, import any library. Full language support — no sandboxing restrictions.
Deploy
All runtimes share the same hot-container model. Benchmark warm-path latency on your own bundles: language choice alone does not guarantee a fixed millisecond budget.
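A minimal sketch of how you might benchmark warm-path latency yourself. The names (warm_path_latency, handler) and the local stub are illustrative assumptions, not part of the platform; in practice fn would wrap an HTTP call to your deployed endpoint, and warm-up iterations would absorb any cold start.

```python
import time
import statistics

def handler(event, context=None):
    # Local stub standing in for a deployed function's warm path.
    return {"statusCode": 200, "body": "{}"}

def warm_path_latency(fn, event, warmups=10, runs=200):
    """Measure warm-path latency of fn in milliseconds (p50 and p95)."""
    for _ in range(warmups):
        fn(event, None)  # discard cold-path iterations
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn(event, None)
        samples.append((time.perf_counter() - start) * 1000.0)
    samples.sort()
    return {
        "p50_ms": statistics.median(samples),
        "p95_ms": samples[int(0.95 * len(samples)) - 1],
    }

print(warm_path_latency(handler, {"body": "{}"}))
```

Comparing p50 against p95 on your own bundles is more meaningful than any fixed per-language latency claim.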
Example
The same handler in 3 languages
export async function handler(event, context) {
  const payload = typeof event.body === 'string'
    ? JSON.parse(event.body || '{}')
    : (event || {});
  const { name } = payload;

  // Any npm package works — fetch, axios, openai...
  const res = await fetch("https://api.example.com/greet", {
    method: "POST",
    body: JSON.stringify({ name }),
  });

  return {
    statusCode: 200,
    body: JSON.stringify({ message: `Hello, ${name || 'world'}!` }),
  };
}
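For comparison, here is a sketch of the equivalent Python handler. This assumes the Python runtime receives the same event/context shapes as the Node.js example above (a dict with an optional JSON-string body); the outbound HTTP call is elided to a comment since any pip package could fill that role.

```python
import asyncio
import json

async def handler(event, context):
    # Accept either a JSON-string body or a pre-parsed dict,
    # mirroring the Node.js example.
    body = event.get("body")
    payload = json.loads(body or "{}") if isinstance(body, str) else (event or {})
    name = payload.get("name")

    # Any pip package works here for outbound calls (aiohttp, httpx, ...).

    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name or 'world'}!"}),
    }

result = asyncio.run(handler({"body": json.dumps({"name": "Ada"})}, None))
```

The response dict carries the same statusCode/body fields as the JavaScript version, so callers cannot tell which runtime produced it.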
Features
Full runtime support
Node.js 22
Full ESM, latest V8 engine, any npm package. The fastest way to ship a JS API.
Python 3.12
Native async/await, aiohttp, and pip packages via layers. Great for AI and data workloads.
Go 1.22
Compiled binary performance in containers. Perfect for CPU-bound or latency-sensitive tasks.
Unified event contract
Same event, context, and response schema for all runtimes. Migrate between languages without changing callers.
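A caller-side sketch of what that contract means in practice. The field names (statusCode, a JSON-string body) are taken from the handler example above; check_contract is an illustrative helper, not a platform API.

```python
import json

def check_contract(response):
    """Validate the response schema shared by all runtimes:
    an integer statusCode and a JSON-encoded string body."""
    assert isinstance(response.get("statusCode"), int)
    assert isinstance(response.get("body"), str)
    json.loads(response["body"])  # body must parse as JSON
    return True

# The same check passes whether the function was written in
# Node.js, Python, or Go; callers never see the difference.
check_contract({"statusCode": 200,
                "body": json.dumps({"message": "Hello, world!"})})
```

Because the schema is identical across runtimes, migrating a function from one language to another requires no changes on the calling side.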
Get started free
Deploy your first function in minutes. No credit card required.