Python 3.12 serverless functions for APIs, cron jobs, and ML tools
Python 3.12 serverless functions run in isolated containers—numpy, pandas, scikit-learn, and any PyPI package work out of the box. Wire functions to HTTP routes for AI agent tool backends, cron schedules for nightly ETL, or background pipelines for ML inference steps.
Last updated: 2026-04-20
Answer first
Direct answer
Inquir runs Python 3.12 in Docker containers. List dependencies in requirements.txt; the platform builds the image. numpy, pandas, scikit-learn, transformers, and any C-extension library compile and run exactly as they do locally.
When it fits
- You need numpy, pandas, scikit-learn, or any C-extension ML library in a serverless runtime.
- You are building AI agent tool backends in Python that need HTTP auth, cron schedules, and async pipelines in one platform.
Tradeoffs
- Managing Lambda Layers for numpy, pandas, and ML libraries adds packaging overhead—the platform-imposed 250 MB unzipped limit means careful pruning of every release.
- Stitching API Gateway + Lambda + CloudWatch + EventBridge for a Python-based AI agent tool backend is non-trivial ceremony for a team that just needs authenticated HTTP endpoints.
Workload and what breaks
Where edge runtimes and managed services limit Python
Edge runtimes do not support Python at all. Lambda and GCP Cloud Functions support Python but pay a cold-start penalty on large dependency trees: numpy, pandas, and scikit-learn can push cold starts into multiple seconds without careful layer management.
AI teams frequently need Python for tool backends: call OpenAI, run scikit-learn inference, process dataframes, or invoke Hugging Face pipelines—all of which need a real Python 3.12 container with full PyPI support.
Trade-offs
Why Lambda-only Python has friction
Managing Lambda Layers for numpy, pandas, and ML libraries adds packaging overhead—the platform-imposed 250 MB unzipped limit means careful pruning of every release.
Stitching API Gateway + Lambda + CloudWatch + EventBridge for a Python-based AI agent tool backend is non-trivial ceremony for a team that just needs authenticated HTTP endpoints.
How Inquir helps
Full Python 3.12 behind one gateway
Inquir runs Python 3.12 in Docker containers. List dependencies in requirements.txt; the platform builds the image. numpy, pandas, scikit-learn, transformers, and any C-extension library compile and run exactly as they do locally.
The same Python function can serve as an AI agent tool endpoint (HTTP route with API key auth), a nightly ETL job (scheduled pipeline), or an LLM pipeline stage (background step)—one handler, three invocation paths, one observability story.
What you get
Python 3.12 runtime capabilities
Full PyPI ecosystem
numpy, pandas, scikit-learn, transformers, Pillow, httpx, psycopg2, and any C-extension package that compiles on Python 3.12.
AI agent tool backends
Expose Python ML functions as authenticated HTTP endpoints—call OpenAI, run inference, query dataframes, and return structured JSON to agent orchestrators.
Nightly ETL and data processing
Schedule Python functions as cron jobs: read from APIs, transform with pandas, write to databases—with run history and retries visible in the console.
LLM pipeline stages
Use Python steps inside multi-stage pipelines for retrieval, embedding generation, moderation, or summarization—share secrets and layers across stages.
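A pipeline stage is just another handler. The sketch below shows a hypothetical moderation stage; the event shape ({"documents": [...]}) and the 8,000-character cutoff are illustrative assumptions, not a documented Inquir contract.

```python
def handler(event, context):
    # Pipeline stages receive the previous stage's output as the event payload.
    # The "documents" field and size threshold below are illustrative.
    docs = event.get("documents", [])
    # Drop empty or oversized documents before the next stage runs
    kept = [d for d in docs if d and len(d) < 8000]
    return {"documents": kept, "dropped": len(docs) - len(kept)}
```

The stage returns plain JSON-serializable data, so the next step in the pipeline can consume it without any framework-specific wrapping.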
What to do next
How to deploy a Python 3.12 serverless function
Write your handler
Define def handler(event, context) in a .py file. event["body"] is a string; parse it with json.loads when needed.
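A minimal handler following that signature might look like this (the greeting logic is just a placeholder):

```python
import json

def handler(event, context):
    # event["body"] is a raw string; parse it defensively
    body = json.loads(event.get("body") or "{}")
    name = body.get("name", "world")
    return {"statusCode": 200, "body": json.dumps({"message": f"hello, {name}"})}
```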
Declare dependencies
Add packages to requirements.txt. The platform pip-installs them into the container image—no manual layer packaging.
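A requirements.txt for a pandas-based function might look like this (package choices and version pins are illustrative):

```text
numpy>=1.26
pandas>=2.2
scikit-learn>=1.4
psycopg2-binary>=2.9
httpx>=0.27
```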
Attach to gateway, schedule, or pipeline
Wire the function to an HTTP route, a scheduled pipeline trigger, or a background job step—same handler regardless of invocation path.
Code examples
Python handler: AI agent tool and cron ETL
event["body"] arrives as a JSON string; parse it with json.loads. Return {"statusCode": ..., "body": json.dumps(...)} for HTTP route responses. Secrets are injected as environment variables.
```python
import json
import os

from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])  # injected from workspace secrets

def handler(event, context):
    body = json.loads(event.get("body") or "{}")
    text = body.get("text")
    if not text:
        return {"statusCode": 400, "body": json.dumps({"error": "text required"})}
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": f"Classify the sentiment of: {text}"}],
    )
    return {
        "statusCode": 200,
        "body": json.dumps({"sentiment": response.choices[0].message.content}),
    }
```
```python
import os

import pandas as pd
import psycopg2

def process_row(row):
    # Placeholder transform; replace with your real per-row logic
    return dict(row)

def handler(event, context):
    conn = psycopg2.connect(os.environ["DATABASE_URL"])
    df = pd.read_sql("SELECT * FROM events WHERE processed = false LIMIT 1000", conn)
    # transform and upsert
    results = []
    for _, row in df.iterrows():
        results.append(process_row(row))
    conn.close()
    return {"processed": len(results)}
```
When it fits
Best fit for Python 3.12 serverless functions
When this works
- You need numpy, pandas, scikit-learn, or any C-extension ML library in a serverless runtime.
- You are building AI agent tool backends in Python that need HTTP auth, cron schedules, and async pipelines in one platform.
When to skip it
- Your Python logic is tiny and you only need global edge delivery—Python is not available on most edge isolate platforms.
FAQ
Which Python version does Inquir use?
Python 3.12. The standard library, asyncio, and all synchronous patterns are supported. Declare the handler with async def to run async code.
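An async handler can fan out concurrent work inside a single invocation. A minimal sketch, assuming the platform awaits an async def handler; the fetch helper and its squaring logic are stand-ins for real async I/O:

```python
import asyncio
import json

async def handler(event, context):
    async def fetch(n):
        await asyncio.sleep(0)  # stand-in for real async I/O (httpx, async DB driver, ...)
        return n * n

    # Run the awaitables concurrently and collect results in order
    results = await asyncio.gather(*(fetch(n) for n in range(3)))
    return {"statusCode": 200, "body": json.dumps({"squares": results})}
```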
Can I use pandas and numpy without hitting size limits?
Yes. Container images are not subject to the 250 MB unzipped limit that Lambda Layers impose. Declare numpy and pandas in requirements.txt; the platform builds the image without size-based pruning.
How do I access environment variables and secrets?
Bind secrets in the Inquir console; they are injected as environment variables at runtime. Access with os.environ["SECRET_NAME"] — never hardcode credentials in your Python handler.
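In practice it helps to funnel secret reads through one small helper so missing bindings fail loudly. get_secret below is a hypothetical convenience wrapper, not part of the platform API:

```python
import os

def get_secret(name, default=None, required=False):
    # Secrets bound in the console are injected as environment variables;
    # read them at call time so local runs and tests can override them.
    value = os.environ.get(name, default)
    if required and value is None:
        raise RuntimeError(f"missing required secret: {name}")
    return value
```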
Can Python functions call private databases?
Yes. Container-backed functions can connect to VPC databases, Redis clusters, and internal services that edge functions cannot reach.