Inquir Compute

Python 3.12 serverless functions for APIs, cron jobs, and ML tools

Python 3.12 serverless functions run in isolated containers—numpy, pandas, scikit-learn, and any PyPI package work out of the box. Wire functions to HTTP routes for AI agent tool backends, cron schedules for nightly ETL, or background pipelines for ML inference steps.

Last updated: 2026-04-20


Where edge runtimes and managed services limit Python

Edge isolate runtimes do not support Python at all. AWS Lambda and Google Cloud Functions support Python, but large dependency trees hurt cold-start performance—numpy, pandas, and scikit-learn can push cold starts into seconds without careful layer management.

AI teams frequently need Python for tool backends: call OpenAI, run scikit-learn inference, process dataframes, or invoke Hugging Face pipelines—all of which need a real Python 3.12 container with full PyPI support.

Why Lambda-only Python has friction

Managing Lambda Layers for numpy, pandas, and ML libraries adds packaging overhead—the platform-imposed 250 MB unzipped limit means careful pruning of every release.

Stitching API Gateway + Lambda + CloudWatch + EventBridge for a Python-based AI agent tool backend is non-trivial ceremony for a team that just needs authenticated HTTP endpoints.

Full Python 3.12 behind one gateway

Inquir runs Python 3.12 in Docker containers. List dependencies in requirements.txt; the platform builds the image. numpy, pandas, scikit-learn, transformers, and any C-extension library compile and run exactly as they do locally.

The same Python function can serve as an AI agent tool endpoint (HTTP route with API key auth), a nightly ETL job (scheduled pipeline), or an LLM pipeline stage (background step)—one handler, three invocation paths, one observability story.
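A minimal sketch of that one-handler pattern. The event shapes here are assumptions for illustration—HTTP invocations carrying a body string, scheduled runs passing a plain dict—so check your gateway's actual contract:

```python
import json

def handler(event, context):
    # Assumed event shapes: HTTP invocations carry a "body" JSON string;
    # scheduled and pipeline invocations pass a plain dict payload.
    if "body" in event:
        payload = json.loads(event.get("body") or "{}")
        return {"statusCode": 200, "body": json.dumps({"received": payload})}
    return {"received": event}
```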

Python 3.12 runtime capabilities

Full PyPI ecosystem

numpy, pandas, scikit-learn, transformers, Pillow, httpx, psycopg2, and any C-extension package that compiles on Python 3.12.

AI agent tool backends

Expose Python ML functions as authenticated HTTP endpoints—call OpenAI, run inference, query dataframes, and return structured JSON to agent orchestrators.

Nightly ETL and data processing

Schedule Python functions as cron jobs: read from APIs, transform with pandas, write to databases—with run history and retries visible in the console.

LLM pipeline stages

Use Python steps inside multi-stage pipelines for retrieval, embedding generation, moderation, or summarization—share secrets and layers across stages.
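As a sketch of a retrieval-prep stage, the hypothetical handler below splits input text into fixed-size chunks for a downstream embedding step; the chunk size and the event/return shapes are illustrative assumptions, not platform contracts:

```python
def handler(event, context):
    # Split the incoming text into fixed-size chunks for embedding.
    text = event.get("text", "")
    chunk_size = event.get("chunk_size", 200)
    chunks = [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
    return {"chunks": chunks, "count": len(chunks)}
```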

How to deploy a Python 3.12 serverless function

1. Write your handler

Define def handler(event, context) in a .py file. event.body is a string—parse with json.loads when needed.
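A minimal handler that follows that shape (the greeting payload is an invented example):

```python
import json

def handler(event, context):
    # event.body is a JSON string—parse it before use.
    body = json.loads(event.get("body") or "{}")
    name = body.get("name", "world")
    return {"statusCode": 200, "body": json.dumps({"message": f"hello, {name}"})}
```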

2. Declare dependencies

Add packages to requirements.txt. The platform pip-installs them into the container image—no manual layer packaging.
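For example, a requirements.txt covering the handlers on this page might look like this—the version pins are illustrative, so pin whatever you actually test against:

```
numpy==1.26.4
pandas==2.2.2
scikit-learn==1.4.2
openai==1.30.0
psycopg2-binary==2.9.9
```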

3. Attach to gateway, schedule, or pipeline

Wire the function to an HTTP route, a scheduled pipeline trigger, or a background job step—same handler regardless of invocation path.

Python handler: AI agent tool and cron ETL

event.body arrives as a JSON string—parse with json.loads. Return {"statusCode": ..., "body": json.dumps(...)} for HTTP route responses. Secrets are injected as environment variables.

tools/classify.py (AI agent tool endpoint)
import json
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])  # injected from workspace secrets

def handler(event, context):
    body = json.loads(event.get("body") or "{}")
    text = body.get("text")
    if not text:
        return {"statusCode": 400, "body": json.dumps({"error": "text required"})}
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": f"Classify the sentiment of: {text}"}],
    )
    return {"statusCode": 200, "body": json.dumps({"sentiment": response.choices[0].message.content})}
jobs/nightly-sync.py (scheduled cron pipeline)
import os
import pandas as pd
import psycopg2

def process_row(row):
    # Placeholder transform—replace with your real per-row logic.
    return {"id": row["id"]}

def handler(event, context):
    conn = psycopg2.connect(os.environ["DATABASE_URL"])
    df = pd.read_sql("SELECT * FROM events WHERE processed = false LIMIT 1000", conn)
    results = [process_row(row) for _, row in df.iterrows()]
    # Mark the batch as processed so the next run picks up fresh rows.
    with conn.cursor() as cur:
        cur.execute(
            "UPDATE events SET processed = true WHERE id = ANY(%s)",
            (df["id"].tolist(),),
        )
    conn.commit()
    conn.close()
    return {"processed": len(results)}

Best fit for Python 3.12 serverless functions

When this works

  • You need numpy, pandas, scikit-learn, or any C-extension ML library in a serverless runtime.
  • You are building AI agent tool backends in Python that need HTTP auth, cron schedules, and async pipelines in one platform.

When to skip it

  • Your Python logic is tiny and you only need global edge delivery—Python is not available on most edge isolate platforms.

FAQ

Which Python version does Inquir use?

Python 3.12. Standard library, asyncio, and all synchronous patterns are supported. Use async def handler for async handlers.
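A minimal async handler sketch, assuming the platform awaits coroutine handlers as described above; the asyncio.sleep stands in for real async I/O such as an httpx call:

```python
import asyncio
import json

async def handler(event, context):
    # Stand-in for real async I/O (an httpx request, an async DB query, ...).
    await asyncio.sleep(0)
    return {"statusCode": 200, "body": json.dumps({"ok": True})}
```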

Can I use pandas and numpy without hitting size limits?

Yes. Containers have no 250 MB unzipped limit like Lambda Layers. Declare numpy and pandas in requirements.txt; the platform builds the image without size-based pruning.

How do I access environment variables and secrets?

Bind secrets in the Inquir console; they are injected as environment variables at runtime. Access with os.environ["SECRET_NAME"] — never hardcode credentials in your Python handler.
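A defensive pattern for reading a bound secret—the variable name comes from the example above, and the 500-error shape is an assumption, not a platform convention:

```python
import json
import os

def handler(event, context):
    # Fail fast with a clear error if the secret binding is missing,
    # instead of raising a bare KeyError deep in the request path.
    api_key = os.environ.get("OPENAI_API_KEY")
    if not api_key:
        return {"statusCode": 500, "body": json.dumps({"error": "OPENAI_API_KEY is not bound"})}
    return {"statusCode": 200, "body": json.dumps({"ok": True})}
```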

Can Python functions call private databases?

Yes. Container-backed functions can connect to VPC databases, Redis clusters, and internal services that edge functions cannot reach.

Inquir Compute

The simplest way to run AI agents and backend jobs without infrastructure.

Contact info@inquir.org

© 2025 Inquir Compute. All rights reserved.