
Docker Compose

The fastest way to get started with tvarr.

Image Tags

| Tag | Description |
|-----|-------------|
| `:release` | Latest stable release (recommended) |
| `:latest` | Most recent build (includes main branch and releases) |
| `:preview` | Latest main branch build |
| `:X.Y.Z` | Specific version (e.g., `1.0.0`) |
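
To pin a deployment to a fixed version instead of tracking the moving `:release` tag, reference the versioned tag directly. The version number below is only an example:

```shell
# Pull a specific pinned version of the standalone image
docker pull ghcr.io/jmylchreest/tvarr:1.0.0
```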

Standalone (Single Container)

For most home users, the standalone all-in-one image is recommended:

docker-compose.yml

```yaml
services:
  tvarr:
    image: ghcr.io/jmylchreest/tvarr:release
    restart: unless-stopped
    ports:
      - "8080:8080"
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=UTC
      - TVARR_SERVER_BASE_URL=http://your-host:8080
    volumes:
      - tvarr-data:/data
    # Intel/AMD GPU (VAAPI) - uncomment for hardware transcoding
    # devices:
    #   - /dev/dri:/dev/dri

volumes:
  tvarr-data:
```

Start the stack:

```shell
docker compose up -d
```

Access the web UI at http://localhost:8080.
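
Before opening the browser, you can confirm the service is answering HTTP. This assumes only that the UI listens on the published port:

```shell
# Expect an HTTP status line (e.g. 200 or a redirect) from the web UI
curl -sI http://localhost:8080 | head -n 1
```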

Distributed Mode (Coordinator + Workers)

For larger setups or dedicated transcoding hardware:

docker-compose.yml

```yaml
services:
  tvarr:
    image: ghcr.io/jmylchreest/tvarr-coordinator:release
    restart: unless-stopped
    ports:
      - "8080:8080"
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=UTC
      - TVARR_GRPC_ENABLED=true
    volumes:
      - tvarr-data:/data

  ffmpegd:
    image: ghcr.io/jmylchreest/tvarr-transcoder:release
    restart: unless-stopped
    depends_on:
      tvarr:
        condition: service_healthy
    environment:
      - TVARR_COORDINATOR_URL=tvarr:9090
      - TVARR_MAX_JOBS=4
    # Intel/AMD GPU
    devices:
      - /dev/dri:/dev/dri

volumes:
  tvarr-data:
```
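
Because the worker service has no fixed `container_name`, Compose can run several replicas of it against the same coordinator. A sketch of scaling out (the replica count is arbitrary):

```shell
# Run three transcoder workers alongside one coordinator
docker compose up -d --scale ffmpegd=3
```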

NVIDIA GPU Support

Add an NVIDIA GPU device reservation to your transcoder service:

```yaml
services:
  ffmpegd:
    image: ghcr.io/jmylchreest/tvarr-transcoder:release
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu, video, compute]
```
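
Before relying on the reservation above, it can help to confirm the host's NVIDIA Container Toolkit works at all. This sketch uses a stock CUDA base image (the exact tag is an example) and assumes the toolkit is installed on the host:

```shell
# If this prints the GPU table, the NVIDIA container runtime is working
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi
```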

Environment Variables

| Variable | Default | Description |
|----------|---------|-------------|
| `PUID` | `1000` | User ID for file permissions |
| `PGID` | `1000` | Group ID for file permissions |
| `TZ` | `UTC` | Timezone |
| `TVARR_SERVER_BASE_URL` | - | External URL for generated playlists |
| `TVARR_LOGGING_LEVEL` | `info` | Log level (`debug`, `info`, `warn`, `error`) |
| `TVARR_DATABASE_DSN` | `/data/tvarr.db` | Database path |
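
Rather than hardcoding these values, the compose file can substitute them from a `.env` file placed next to `docker-compose.yml`. A minimal sketch of the `environment` block using Compose variable substitution, where `${VAR:-default}` falls back to the default when the variable is unset:

```yaml
# docker-compose.yml fragment - values come from .env or the defaults shown
environment:
  - PUID=${PUID:-1000}
  - PGID=${PGID:-1000}
  - TZ=${TZ:-UTC}
```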

See Configuration for all options.