Installation
Install the pydefend package from PyPI and prepare your environment for running the Defend API locally or in Docker.
Defend requires Python 3.12+. The published package name is pydefend; the CLI entry point is defend.
The default PyPI install is intentionally slim: it does not include PyTorch or Hugging Face transformers. That is sufficient when claude or openai is the provider (normalization and regex modules run first, then the LLM modules). The heuristic intent fast-pass (L2) runs only when the provider is defend. For the local classifier, install the optional extra below.
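A typical slim install looks like the sketch below. The `--help` flag is an assumption about the CLI; the package and entry-point names come from this page:

```shell
# Slim install: no PyTorch, no transformers
pip install pydefend

# Verify the `defend` entry point is on PATH
defend --help
```

If `defend` is not found after installing, check that the environment's script directory is on your PATH.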
Local Defend classifier (optional)
If defend.config.yaml sets provider: defend, install ML dependencies:
```shell
pip install "pydefend[local]"
```

This pulls in PyTorch, transformers, and numpy so the Hugging Face model configured by DEFEND_MODEL_ID can load. For smaller CPU-only wheels in your own images, install torch from the PyTorch CPU index before or after pydefend[local] so pip does not pull CUDA-only packages.
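For example, a CPU-only image build might install torch from PyTorch's published CPU wheel index first; the ordering shown here is a sketch of the approach described above:

```shell
# Install CPU-only torch first so pip does not resolve CUDA wheels
pip install torch --index-url https://download.pytorch.org/whl/cpu

# Then install pydefend with the local-classifier extra;
# torch is already satisfied, so pip keeps the CPU build
pip install "pydefend[local]"
```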
Docker
Use the adxzer/defend:local image when the container runs with defend as the provider. The default adxzer/defend:latest image matches the slim install (no local classifier). See Docker.
Create configuration
Defend expects defend.config.yaml in the current working directory when you run defend serve. Generate it with an init token from pydefend.com:
```shell
defend init --token "defend_v1_..."
```

Alternatively, start from the minimal example on the Quick start page.
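If you write the file by hand instead, it might look like the sketch below. Only the provider key is documented on this page; treat everything else about the file's shape as an assumption and prefer the generated or Quick start version:

```yaml
# defend.config.yaml -- sketch only; `provider` is the key documented here
provider: defend   # or: claude, openai
```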
Optional API keys
If you use claude or openai for evaluation or output guarding, set the corresponding provider key in your environment. See Environment variables.
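For example, assuming pydefend reads the providers' standard variable names (an assumption; the Environment variables page is authoritative):

```shell
# Provider API keys -- names assumed; confirm in Environment variables
export ANTHROPIC_API_KEY="sk-ant-..."   # for provider: claude
export OPENAI_API_KEY="sk-..."          # for provider: openai
```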
Docker
For container deployment, use the published images and mount your config. Full commands and platform notes are in the Docker guide.
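A minimal sketch of running the slim image with a mounted config. The container mount path and port are assumptions; take the actual values from the Docker guide:

```shell
# Mount path and port are assumed -- verify against the Docker guide
docker run --rm \
  -v "$(pwd)/defend.config.yaml:/app/defend.config.yaml" \
  -p 8080:8080 \
  adxzer/defend:latest
```

Use adxzer/defend:local instead when the config sets provider: defend.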