Inquir Compute
Serverless runtimes

Serverless functions and runtimes

Run isolated serverless functions in real containers: the same event model, layers, and hot pools across every language, with no need to split services across runtimes.

Same platform, any language

1

Pick a language

Choose Node.js 22, Python 3.12, or Go 1.22 when creating a function. Switch any time.

2

Write your code

Use any package manager and import any library. Full language support, with no sandboxing restrictions.

3

Deploy

All runtimes share the same hot-container model. Benchmark warm-path latency on your own bundles; language choice alone does not guarantee a fixed millisecond budget.
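
Warm-path numbers are easiest to trust when you measure them yourself. Below is a minimal local sketch, assuming a Python runtime and a stand-in handler; it times handler execution only, not container scheduling or network hops:

```python
import json
import time

def handler(event, context):
    # Stand-in handler: swap in your own bundle's entry point.
    return {"statusCode": 200, "body": json.dumps({"ok": True})}

def warm_path_p50(n: int = 1000) -> float:
    # Invoke the handler n times and report the median latency in milliseconds.
    samples = []
    for _ in range(n):
        start = time.perf_counter()
        handler({"body": "{}"}, None)
        samples.append((time.perf_counter() - start) * 1000)
    samples.sort()
    return samples[n // 2]
```

For representative numbers, run this inside the deployed container rather than on your laptop, since cold starts and pool scheduling are what dominate tail latency.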

The same handler in 3 languages

index.js
export async function handler(event, context) {
  // Accept either a raw object or an API-style event with a JSON string body.
  const payload =
    typeof event?.body === "string" ? JSON.parse(event.body || "{}") : (event || {});
  const { name } = payload;

  // Any npm package works: fetch, axios, openai...
  await fetch("https://api.example.com/greet", {
    method: "POST",
    body: JSON.stringify({ name }),
  });

  return {
    statusCode: 200,
    body: JSON.stringify({ message: `Hello, ${name || "world"}!` }),
  };
}

Full runtime support

JS

Node.js 22

Full ESM, latest V8 engine, any npm package. The fastest way to ship a JS API.

Python 3.12

Native async/await, aiohttp, and pip packages via layers. Great for AI and data workloads.

Go 1.22

Compiled binary performance in containers. Perfect for CPU-bound or latency-sensitive tasks.

Unified event contract

Same event, context, and response schema for all runtimes. Migrate between languages without changing callers.
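
Concretely, the shared response shape can be written down as a type. A minimal sketch in Python, inferred from the JS example on this page (only statusCode and body are attested here; any further fields would be assumptions):

```python
import json
from typing import Any, TypedDict

class Response(TypedDict):
    # The two response fields shown in the handler example above.
    statusCode: int
    body: str

def make_response(status: int, payload: dict[str, Any]) -> Response:
    # Serialize once at the boundary so every runtime returns the same shape.
    return {"statusCode": status, "body": json.dumps(payload)}
```

Since every runtime emits this same shape, moving a function from Node.js to Go changes nothing for the services that call it.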

Get started free

Deploy your first function in minutes. No credit card required.

Inquir Compute

The simplest way to run AI agents and backend jobs without infrastructure.

Contact info@inquir.org

© 2025 Inquir Compute. All rights reserved.