
Docker + Kubernetes

Containerized AI workers — portable, scalable, auditable

What it is

Docker for reproducible builds; Kubernetes for orchestration. Every Vaaani service ships as a container so it runs identically on the customer's cloud, on-prem, or my laptop.

How Vaaani uses it

  • Standardized deployment across AWS, Azure, GCP
  • Zero-downtime rollouts with health checks
  • Horizontal autoscaling based on queue depth or CPU
  • Secret management, config maps, namespaces for multi-tenancy
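The rollout and autoscaling bullets above can be sketched as a Deployment plus a HorizontalPodAutoscaler. This is an illustrative fragment, not the production manifest: the `vaaani-worker` name, image registry, port 8000, and `/healthz` probe path are all assumptions.

```yaml
# Illustrative manifest; names, image, port, and probe path are assumptions.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: vaaani-worker
spec:
  replicas: 2
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 0        # zero-downtime: old pods stay until new ones pass checks
  selector:
    matchLabels:
      app: vaaani-worker
  template:
    metadata:
      labels:
        app: vaaani-worker
    spec:
      containers:
        - name: worker
          image: registry.example.com/vaaani-worker:latest
          ports:
            - containerPort: 8000
          readinessProbe:      # gates traffic during rollouts
            httpGet:
              path: /healthz
              port: 8000
---
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: vaaani-worker
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: vaaani-worker
  minReplicas: 2
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
```

Queue-depth scaling, also mentioned above, needs a custom or external metric rather than the CPU resource metric shown here.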

Why it makes the cut

Containers eliminate 'works on my machine.' K8s eliminates 'who restarts it at 3am?' Both are non-negotiable for production AI.

Sample code

# Dockerfile
FROM python:3.12-slim
WORKDIR /app
# Copy requirements first so the dependency layer caches across code changes
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
CMD ["uvicorn", "app:api", "--host", "0.0.0.0", "--port", "8000"]
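The readiness probe above needs something to hit. The real services run FastAPI under uvicorn; this stdlib-only sketch of a `/healthz` endpoint is an illustrative stand-in, with the port and path as assumptions:

```python
# Minimal /healthz endpoint a Kubernetes readiness probe can poll.
# Stdlib-only sketch; the actual services use FastAPI/uvicorn.
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/healthz":
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        # Silence per-request logging for the sketch
        pass

def serve(port=8000):
    # Bind on all interfaces so the probe can reach the container
    server = HTTPServer(("0.0.0.0", port), HealthHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Keeping the probe endpoint cheap and dependency-free means a slow model or queue backend degrades gracefully instead of failing the whole pod.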

Have a project that needs Docker?

30-min discovery call. You describe the busywork; I map it to an AI worker and a budget.