Serverless functions are the core unit of compute. Each function is an isolated handler that receives an event and context and returns a result—whether you invoke from the editor, async jobs, or HTTP routes on the gateway.
Clicking Run in the test panel sends your JSON directly as the event object (no API Gateway wrapper). HTTP routes instead deliver an API Gateway–like event whose body is a string. Handlers therefore often branch on typeof event.body === 'string' (Node), or the equivalent in other runtimes, so both paths work.
How to create functions
- From the UI — Click + New Function on the Functions page. Optionally upload a .zip with your code.
- Drag & drop — Drop files or folders onto the file tree in the editor sidebar.
Function configuration
| Setting | Default | Description |
|---|---|---|
| Handler | index.handler | Entry point as file.export |
| Runtime | nodejs22 | Node.js 22, Python 3.12, or Go 1.22 |
| Timeout | 5000 ms | Max execution time before the function is killed |
| Memory | 256 MB | Memory limit for the container |
| Network | — | Enable to allow outbound HTTP/TCP from the function |
| Layers | — | Optional list of catalog layer ids attached to the function (see the Layers doc). |
| Runtime variant (Node.js) | lite | lite (default) or full — selects the Node container image; use full when you need the heavier browser-capable stack. |
Hot containers / hot functions
Click Deploy to pre-warm containers. Hot lambdas keep an HTTP server running inside the container so subsequent invocations skip the cold-start penalty (~5–50 ms vs. 500–2000 ms).
Containers above the per-function minimum are recycled after 5 minutes idle. The last remaining container is stopped after 10 minutes idle. Both thresholds are configurable via HOT_LAMBDA_TTL_MS and HOT_LAMBDA_MAX_IDLE_TTL_MS.
The server must run with ENABLE_HOT_LAMBDAS=true; otherwise Deploy reports that hot lambdas are not enabled.
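The variable names above can be supplied as environment variables when starting the server; the values shown here are just the stated defaults expressed in milliseconds, and the start command is a placeholder for whatever your deployment uses:

```shell
# Enable hot lambdas and tune idle recycling (values in milliseconds).
export ENABLE_HOT_LAMBDAS=true
export HOT_LAMBDA_TTL_MS=300000           # recycle extra containers after 5 min idle
export HOT_LAMBDA_MAX_IDLE_TTL_MS=600000  # stop the last container after 10 min idle
# ...then start the server with your usual command.
```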
Test panel
The right-hand panel in the editor has tabs for running and configuring the function:
- Run — synchronous invoke with your JSON as event. Enable Stream when you want a Server-Sent Events stream from supported runtimes.
- Config — handler, timeout, network, Node runtime variant, layers, and environment variables.
- Traces — recent runs for this function (same trace model as the Executions page).
- AI — in-editor assistant when your deployment enables it.