Inquir Compute

Node.js 22 serverless functions with native modules and warm pools

Node.js 22 serverless functions run in isolated containers—full npm ecosystem, native addons, and ES2024 features without V8 isolate restrictions. Wire functions to HTTP routes, cron schedules, webhook ingress, or background pipelines from one workspace.

Last updated: 2026-04-20

Direct answer

Inquir runs Node.js 22 serverless functions in Docker containers—the same Node.js you use locally, with full native module support and warm pools for steady traffic. Declare dependencies in package.json; the platform builds the container.

When it fits

  • Your handler uses native npm packages (sharp, bcrypt, grpc) that require a real Node.js container.
  • You want HTTP API endpoints, cron jobs, webhook handlers, and background jobs to share one Node.js codebase and one gateway.

Tradeoffs

  • Lambda gives you full Node.js but the operator cost of API Gateway + CloudWatch + IAM roles is non-trivial for small teams.
  • Edge runtimes are fast globally but you lose native modules and anything that needs a real POSIX environment.

Where edge runtimes limit Node.js

Edge serverless runtimes (Cloudflare Workers, Vercel Edge Functions) run a subset of Node.js inside V8 isolates. Native addons like sharp, bcrypt, and grpc-node, filesystem access, subprocess calls, and private network connections are all restricted or unavailable.

Lambda-style platforms like AWS Lambda support full Node.js but require assembling API Gateway, EventBridge, CloudWatch, and Layers yourself before the first function sees traffic.

Full Node.js 22 behind a single gateway

Inquir runs Node.js 22 in Docker containers—the same Node.js you use locally, with the same native module support. Declare dependencies in package.json; the platform builds the container.

HTTP routes, cron triggers, webhook handlers, and background jobs all call the same Node.js function IDs. One deploy, one secrets model, one execution history—whether the caller is a user, a webhook provider, or a clock.

Node.js 22 runtime capabilities

Full npm ecosystem

sharp, bcrypt, better-sqlite3, grpc, canvas, and any native addon that compiles against Node.js 22 work without shimming.

ES2024 and top-level await

Use the latest ECMAScript features, ESM imports, and top-level await in .mjs handlers without transpilation.
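As a minimal sketch (file and function names here are illustrative), top-level await lets a handler module finish async setup before it serves its first event:

```javascript
// config.mjs — illustrative: top-level await is legal at module scope in ESM.
// The module finishes evaluating only after the awaited promise resolves,
// so every later handler call sees fully initialized state.

async function loadConfig() {
  // Stand-in for an async startup task (fetching secrets, opening a DB, ...).
  return { maxWidth: 1024, quality: 85 };
}

// Top-level await: no wrapper function needed in an .mjs file.
const config = await loadConfig();

export async function handler() {
  return { statusCode: 200, body: JSON.stringify(config) };
}
```

Because the setup runs once at module load, warm containers reuse it across invocations instead of paying the cost per request.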

Hot containers for steady traffic

Enable warm pools to pre-warm Node.js containers, cutting p99 latency on busy HTTP routes and agent tool loops.

Gateway + cron + jobs in one

The same function handles HTTP routes, scheduled pipeline triggers, and async background jobs—one codebase, three entry points.
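A sketch of how one handler can branch on its invocation context. The httpMethod check mirrors the Lambda-style HTTP event; the `source: 'schedule'` shape for cron triggers is an illustrative assumption, not a documented payload:

```javascript
// handler.mjs — one function, three entry points.
// Assumption: HTTP invocations carry httpMethod (Lambda-style HTTP event);
// the `source: 'schedule'` shape for cron triggers is hypothetical.
export async function handler(event) {
  if (event.httpMethod) {
    // Gateway HTTP route.
    return { statusCode: 200, body: JSON.stringify({ via: 'http' }) };
  }
  if (event.source === 'schedule') {
    // Scheduled pipeline trigger.
    return { via: 'cron', firedAt: event.time };
  }
  // Anything else: treat it as an async background job payload.
  return { via: 'job', payload: event };
}
```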

How to deploy a Node.js serverless function

1. Write your handler

Export an async handler(event) function from a .mjs file. Use ESM imports. Event shape matches AWS Lambda HTTP events for compatibility.

2. Declare dependencies

Add npm packages to package.json. Native addons compile automatically during the container build—no Dockerfile required.

3. Wire routes, schedules, or jobs

Attach the function to a gateway HTTP route, a scheduled pipeline trigger, or a background job queue. Same handler, different invocation context.
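For step 2, a package.json along these lines is enough (the package name and version range are illustrative); the build step compiles native addons like sharp inside the container:

```json
{
  "name": "image-api",
  "type": "module",
  "dependencies": {
    "sharp": "^0.33.0"
  }
}
```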

HTTP handler with native module

The gateway passes an HTTP-shaped event: path, headers, body (string), queryStringParameters. Return statusCode + body. Native modules like sharp work because the runtime is a real container.

api/resize.mjs
import sharp from 'sharp'; // native module — works in containers, not in edge isolates

export async function handler(event) {
  const params = event.queryStringParameters ?? {};
  const width = Number(params.width ?? 320);
  const height = Number(params.height ?? 240);

  if (!event.body) return { statusCode: 400, body: JSON.stringify({ error: 'body required' }) };
  const input = Buffer.from(event.body, 'base64');
  const resized = await sharp(input).resize(width, height).jpeg({ quality: 85 }).toBuffer();

  return {
    statusCode: 200,
    headers: { 'Content-Type': 'image/jpeg' },
    body: resized.toString('base64'),
    isBase64Encoded: true,
  };
}

Best fit for Node.js 22 serverless functions

When this works

  • Your handler uses native npm packages (sharp, bcrypt, grpc) that require a real Node.js container.
  • You want HTTP API endpoints, cron jobs, webhook handlers, and background jobs to share one Node.js codebase and one gateway.

When to skip it

  • Tiny, pure-JS logic that benefits from running at hundreds of edge POPs for global millisecond latency—use a V8-isolate edge runtime for that.

FAQ

Which Node.js version does Inquir use?

Node.js 22 (current LTS). ESM (.mjs), CommonJS (.cjs), and TypeScript (compiled before deploy) are all supported.

Can I use ES modules (import/export)?

Yes. Name files .mjs or set "type": "module" in package.json. Top-level await works in ESM handlers.

How do I handle binary uploads through the gateway?

Binary bodies arrive as base64-encoded strings when the gateway receives binary content. Decode with Buffer.from(event.body, "base64") before processing.
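A minimal sketch of that decode step, using the `isBase64Encoded` flag from the Lambda-style event shape to pick the right decoding:

```javascript
// Decode a gateway event body into a Buffer.
// Binary content arrives base64-encoded with isBase64Encoded set;
// plain text bodies can be decoded as UTF-8.
function decodeBody(event) {
  if (!event.body) throw new Error('body required');
  return event.isBase64Encoded
    ? Buffer.from(event.body, 'base64')
    : Buffer.from(event.body, 'utf8');
}

// Round-trip: raw bytes -> base64 through the gateway -> identical bytes out.
const original = Buffer.from([0xff, 0xd8, 0xff, 0xe0]); // JPEG header bytes
const event = { body: original.toString('base64'), isBase64Encoded: true };
const decoded = decodeBody(event);
```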

Inquir Compute

The simplest way to run AI agents and backend jobs without infrastructure.

Contact info@inquir.org

© 2025 Inquir Compute. All rights reserved.