What teams build with Inquir Compute
From agents and webhooks to nightly jobs—each guide ties a pattern to gateway routes, pipeline schedules, the job queue, and per-function containers.
Situation
Documentation lists features; teams need patterns
Feature lists rarely answer the messy questions: where to acknowledge a webhook before heavy work runs, how to give an LLM a tool without leaking secrets into the prompt, or how a nightly job relates to the same code you expose as an API.
These guides connect each pattern to how Inquir actually behaves—public traffic through the gateway, recurring runs through scheduled pipelines, background load through the job queue—so you are not reverse-engineering the product from scattered pages.
Playbook gaps
Copy-pasting playbooks from other vendors breaks
IAM assumptions, edge CPU caps, and per-invocation pricing on someone else’s cloud do not transfer line for line to Inquir’s gateway, containers, and pipeline model.
The safer move is to pilot one use case in a workspace, read the logs and invocation records, then add the next integration once the first path feels boring.
How Inquir fits
One platform for HTTP, time-driven work, and async queues
Functions are Lambda-compatible (Node 22, Python 3.12, Go 1.22) and run in isolated containers. The same function can back a public gateway route and a step inside a scheduled pipeline; async jobs add another entry point with the same logging and isolation story—so you are not maintaining three deploy paths for one service.
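Lambda compatibility means a function is just an entry point taking an event and a context. A minimal sketch in Python (the event shape here is illustrative; Inquir's exact event fields are not specified on this page):

```python
import json

def handler(event, context):
    # Lambda-style entry point: the same function can back a gateway
    # route, a pipeline step, or an async job with no code changes.
    name = (event or {}).get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```

Because the handler is plain code with no framework wiring, the same file can be tested locally before it ever touches a route or schedule.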
Layers and optional warm containers apply whether traffic is a burst of model tool calls or a steady cron-driven sync; tune cost and latency against real traffic, not guesses.
Capabilities
Guides (each maps to how Inquir is meant to be used)
AI agent backends
Give each tool its own function behind the gateway, lock routes down with API keys, keep secrets in workspace configuration, and lean on warm pools when the model hammers tools in a tight loop.
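A sketch of the key-check pattern for a tool function, assuming the secret lives in workspace configuration exposed as an environment variable. The variable name `INQUIR_TOOL_API_KEY` and the `headers`/`body` event fields are hypothetical, chosen for illustration:

```python
import hmac
import os

def require_api_key(event):
    # Compare the caller's key against workspace configuration in
    # constant time; the secret never appears in the model prompt.
    expected = os.environ.get("INQUIR_TOOL_API_KEY", "")
    supplied = (event.get("headers") or {}).get("x-api-key", "")
    return bool(expected) and hmac.compare_digest(supplied, expected)

def tool_handler(event, context):
    if not require_api_key(event):
        return {"statusCode": 401, "body": "unauthorized"}
    query = event.get("body") or ""
    # Real tool work (search, lookup, computation) would go here.
    return {"statusCode": 200, "body": f"tool result for: {query}"}
```

Keeping the check in code like this is a fallback; the gateway's own API-key enforcement is the first line of defense.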
Cron-style schedules
Model recurring work as a pipeline with a schedule trigger. Expressions are validated when you save; the scheduler wakes pipelines on a fixed cadence, using the same containers and history as HTTP traffic.
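The kind of validation the scheduler applies at save time can be sketched as a rough five-field cron check. This is not Inquir's validator, just an illustration of what "validated when you save" catches (field count, token syntax, numeric ranges):

```python
import re

# Allowed numeric ranges: minute, hour, day-of-month, month, day-of-week.
FIELD_RANGES = [(0, 59), (0, 23), (1, 31), (1, 12), (0, 6)]

def is_valid_cron(expr: str) -> bool:
    """Rough 5-field cron check: *, numbers, ranges, steps, lists."""
    fields = expr.split()
    if len(fields) != len(FIELD_RANGES):
        return False
    token = r"(\*|\d+)(-\d+)?(/\d+)?"
    pattern = re.compile(rf"^{token}(,{token})*$")
    for field, (lo, hi) in zip(fields, FIELD_RANGES):
        if not pattern.match(field):
            return False
        # Check every bare number (ignoring step values) against its range.
        for num in re.findall(r"\d+", field.split("/")[0]):
            if not lo <= int(num) <= hi:
                return False
    return True
```

Catching a bad expression at save time, rather than at the first missed wake-up, is the whole point of validating before the scheduler ever runs.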
Webhook processors
Give each provider a dedicated route, verify signatures on the raw body, return success within vendor timeouts, and hand long work to a pipeline or job when the HTTP window is too small.
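The signature-then-ack pattern, sketched with Python's standard `hmac` module. The header name, event fields, and inline secret are placeholders; in practice the secret comes from workspace configuration and the heavy work is handed to a pipeline or job:

```python
import hashlib
import hmac

def verify_signature(raw_body: bytes, signature: str, secret: bytes) -> bool:
    # HMAC over the *raw* body, before any JSON parsing, compared in
    # constant time. Parsing first can silently change the bytes.
    expected = hmac.new(secret, raw_body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

def webhook_handler(event, context):
    secret = b"webhook-secret"  # placeholder: load from workspace config
    raw = (event.get("body") or "").encode()
    sig = (event.get("headers") or {}).get("x-signature", "")
    if not verify_signature(raw, sig, secret):
        return {"statusCode": 401, "body": "bad signature"}
    # Acknowledge inside the vendor timeout; enqueueing the heavy work
    # to a pipeline or job happens here (call elided).
    return {"statusCode": 200, "body": "accepted"}
```

Returning 200 only after verification, but before the slow work, keeps you inside vendor timeouts without accepting forged payloads.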
Background jobs
Use the job queue when callers need an immediate response while the work continues in the background, or chain steps inside a pipeline when you need explicit stages and retries.
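The queue half of that trade-off, sketched with an in-memory list standing in for the job queue. `enqueue_job` is a hypothetical helper; the real queue client is not documented on this page:

```python
QUEUE = []  # stand-in for the job queue; the real client is workspace-specific

def enqueue_job(name, payload):
    # Hypothetical helper: append a job record for background workers.
    QUEUE.append({"job": name, "payload": payload})

def handler(event, context):
    # Caller gets an immediate 202; the heavy work runs asynchronously.
    enqueue_job("resize-images", {"ids": event.get("ids", [])})
    return {"statusCode": 202, "body": "queued"}
```

If the work needs visible stages, per-step retries, or a schedule, that is the cue to reach for a pipeline instead of a bare job.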
REST APIs
Slice routes per function or route group, configure CORS and rate limits at the gateway, and keep authentication consistent across public surfaces.
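One function backing a small route group might look like the sketch below. The `method`/`path`/`body` event fields are assumptions; CORS and rate limiting stay at the gateway, so the handler only dispatches on the resource:

```python
import json

def items_handler(event, context):
    # One function for the /items route group; CORS and rate limits
    # live at the gateway, so the handler stays focused on the resource.
    method = event.get("method", "GET")
    path = event.get("path", "/")
    if method == "GET" and path == "/items":
        return {"statusCode": 200,
                "body": json.dumps([{"id": 1, "name": "demo"}])}
    if method == "POST" and path == "/items":
        item = json.loads(event.get("body") or "{}")
        return {"statusCode": 201, "body": json.dumps({"created": item})}
    return {"statusCode": 404, "body": "not found"}
```

Slicing per route group like this keeps each function small enough to reason about while still sharing one deploy path.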
LLM pipelines
Break retrieval, moderation, tools, and summarization into functions you can test independently; share utilities through layers and trace each stage.
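The stage-per-function shape can be sketched as below; each stage is a stub here (the retrieval and summarization bodies are placeholders, not real vector-store or LLM calls), but the point is that every stage is callable and testable on its own:

```python
def retrieve(query):
    # Stage 1: fetch context. Stubbed; a real function would hit a store.
    return [f"doc about {query}"]

def moderate(text):
    # Stage 2: cheap content gate, testable without the other stages.
    return "forbidden" not in text.lower()

def summarize(docs):
    # Stage 3: stand-in for the LLM call.
    return " | ".join(docs)[:200]

def pipeline(query):
    # Each stage maps to one function, so stages can be traced,
    # retried, and tested independently.
    if not moderate(query):
        return {"error": "blocked"}
    return {"summary": summarize(retrieve(query))}
```

When each stage is its own function, a regression in moderation or retrieval shows up in that stage's logs and tests, not as a mystery inside one monolithic handler.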
Steps
How to work through these use-case guides
Map each guide to gateway routes, pipelines, schedules, and jobs in Inquir Compute.
Match the sprint
Pick the guide that mirrors what you need to ship in the next iteration.
Create or reuse a workspace
Isolation, API keys, and routes are scoped per workspace—reuse when it keeps noise down.
Deploy and watch runs
Lean on execution history and logs before you bet on high volume.
Cross-links
Open a use-case guide
Each article goes deep on one pattern and links back to comparisons and feature pages. Together they should read like a cookbook for the same product, not six unrelated products.
Fit
When these guides help
When to use
- You are onboarding engineers to Inquir or planning a new route, schedule, or integration.
- You want vocabulary that matches the product—workspaces, functions, gateway, pipelines—so docs and UI stay aligned.
When not to use
- You need contractual or compliance wording. Treat these guides as technical narrative and run legal review separately.
FAQ
FAQ
Do I need separate products for schedules and APIs?
No. One deployment gives you gateway routes and pipeline triggers, schedules included. The guides split topics so each page stays readable.
Where do I start if I am new?
Create a workspace from the home page, deploy a template such as the AI summarizer, hit the gateway URL once, then open the guide that matches your next integration.
How do guides relate to docs.inquir.org?
Guides tell the story; documentation lists limits, CLI flags, and configuration fields you will rely on in production.