Self-Hosting Bitmagnet with Docker Compose
What Is Bitmagnet?
Bitmagnet is a self-hosted BitTorrent DHT crawler and search engine. It continuously discovers torrents from the distributed hash table network, classifies them by content type (movies, TV, music, software), and provides a Torznab-compatible API that integrates directly with Sonarr, Radarr, and other *arr stack applications. Think of it as your own private torrent index — no trackers, no accounts, no third-party dependency.
Updated March 2026: Verified with latest Docker images and configurations.
- Official site: bitmagnet.io
- Source code: github.com/bitmagnet-io/bitmagnet
- License: MIT
Prerequisites
- A Linux server (Ubuntu 22.04+ recommended)
- Docker and Docker Compose installed (guide)
- 2 GB of free RAM (minimum — 4 GB recommended for large indexes)
- 20 GB of free disk space (DHT metadata grows continuously)
- Port 3334 open for DHT communication (TCP and UDP)
Docker Compose Configuration
Create a docker-compose.yml file:
```yaml
services:
  bitmagnet:
    image: ghcr.io/bitmagnet-io/bitmagnet:v0.10.0
    container_name: bitmagnet
    ports:
      - "3333:3333"     # Web UI and API
      - "3334:3334/tcp" # BitTorrent DHT (TCP)
      - "3334:3334/udp" # BitTorrent DHT (UDP)
    environment:
      - POSTGRES_HOST=db
      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD}
      - POSTGRES_DB=bitmagnet
    depends_on:
      db:
        condition: service_healthy
    command:
      - worker
      - run
      - --keys=http_server
      - --keys=queue_server
      - --keys=dht_crawler
    restart: unless-stopped

  db:
    image: postgres:16-alpine
    container_name: bitmagnet-db
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
      POSTGRES_DB: bitmagnet
    shm_size: 1g
    volumes:
      - bitmagnet-db:/var/lib/postgresql/data
    restart: unless-stopped
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 10s
      timeout: 5s
      retries: 5

volumes:
  bitmagnet-db:
```
Create a .env file alongside:
```shell
# PostgreSQL password — generate with: openssl rand -hex 16
POSTGRES_PASSWORD=change_me_to_a_strong_password
```
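To avoid editing the file by hand, you can generate the password and write the .env file in one step. A small shell sketch using the same openssl invocation suggested in the comment above:

```shell
# Generate a 32-character hex password and write it to .env
POSTGRES_PASSWORD=$(openssl rand -hex 16)
printf 'POSTGRES_PASSWORD=%s\n' "$POSTGRES_PASSWORD" > .env

# Restrict permissions so only your user can read the secret
chmod 600 .env
```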
Start the stack:
```shell
docker compose up -d
```
Initial Setup
- Wait 30-60 seconds for the DHT crawler to initialize
- Open http://your-server-ip:3333 to access the web interface
- The DHT crawler begins automatically — you’ll see torrents appearing within minutes
- The first hour typically discovers 10,000-50,000 torrents; after 24 hours, expect 500,000+
- No manual configuration is needed — Bitmagnet crawls the DHT network autonomously
Configuration
TMDB Integration (Recommended)
For automatic movie and TV show classification, add a TMDB API key. Get a free key at themoviedb.org:
```yaml
environment:
  - TMDB_API_KEY=your_tmdb_api_key_here
```
With TMDB enabled, Bitmagnet identifies movies and TV shows by name matching and enriches them with metadata (poster, year, rating, genre).
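Rather than hard-coding the key in the compose file, you can route it through .env the same way as the database password. A sketch of the bitmagnet service's environment block under that assumption:

```yaml
services:
  bitmagnet:
    environment:
      - POSTGRES_HOST=db
      - POSTGRES_PASSWORD=${POSTGRES_PASSWORD}
      - POSTGRES_DB=bitmagnet
      - TMDB_API_KEY=${TMDB_API_KEY}   # actual value lives in .env
```

Add `TMDB_API_KEY=your_key` to .env and run `docker compose up -d` again to apply the change.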
*arr Stack Integration
Bitmagnet exposes a Torznab-compatible API. Add it to Sonarr or Radarr:
- In Sonarr/Radarr, go to Settings → Indexers → Add → Torznab
- Set the URL to http://bitmagnet:3333/torznab
- No API key is required (unless you’ve configured authentication)
- Set categories as needed (Sonarr: 5000-5999 for TV, Radarr: 2000-2999 for Movies)
- Test and save
Crawler Tuning
Control how aggressively Bitmagnet crawls the DHT network. By default, it uses moderate settings. For faster discovery on powerful hardware:
```yaml
environment:
  - DHT_CRAWLER_SCALING_FACTOR=10
  - DHT_CRAWLER_SAVE_FILES_THRESHOLD=200
```
Warning: Higher scaling factors increase CPU, memory, and database write load significantly. Start with defaults and increase gradually.
GraphQL API
Bitmagnet provides a GraphQL API at http://your-server:3333/graphql for programmatic access. Use the built-in GraphQL playground to explore available queries:
```graphql
{
  torrentContent(query: { queryString: "ubuntu" }) {
    totalCount
    items {
      torrent {
        name
        size
        filesCount
      }
    }
  }
}
```
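The same query can also be sent from the command line. A sketch using curl (replace your-server with your host; the payload is the standard GraphQL-over-HTTP JSON envelope):

```shell
# GraphQL-over-HTTP payload wrapping a trimmed version of the query above
PAYLOAD='{"query": "{ torrentContent(query: { queryString: \"ubuntu\" }) { totalCount } }"}'

# POST it to your running instance (uncomment and adjust the host):
# curl -s http://your-server:3333/graphql \
#   -H 'Content-Type: application/json' \
#   -d "$PAYLOAD"
echo "$PAYLOAD"
```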
Reverse Proxy
If exposing Bitmagnet externally, place it behind a reverse proxy. With Nginx Proxy Manager, point your domain to bitmagnet:3333. Note that port 3334 (DHT) should NOT go through the reverse proxy — it needs direct UDP access.
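If you run plain Nginx instead of Nginx Proxy Manager, a minimal server block might look like the sketch below. The domain is a placeholder, TLS is omitted, and only port 3333 is proxied:

```nginx
server {
    listen 80;
    server_name bitmagnet.example.com;  # placeholder domain

    location / {
        proxy_pass http://127.0.0.1:3333;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```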
See Reverse Proxy Setup for full configuration guides.
Backup
The PostgreSQL database contains all discovered torrent metadata:
```shell
# Create a database dump
docker compose exec db pg_dump -U postgres bitmagnet > bitmagnet-backup-$(date +%Y%m%d).sql

# Restore from backup
cat bitmagnet-backup.sql | docker compose exec -T db psql -U postgres bitmagnet
```
Note: the database can grow to tens of gigabytes after extended crawling. Compressed backups (pg_dump | gzip) are recommended. See Backup Strategy.
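To automate compressed backups, a crontab entry along these lines works. The /opt/bitmagnet path is a placeholder for wherever your compose file lives, and note that % must be escaped in crontab:

```crontab
# Nightly at 03:00: compressed, date-stamped dump
0 3 * * * cd /opt/bitmagnet && docker compose exec -T db pg_dump -U postgres bitmagnet | gzip > bitmagnet-backup-$(date +\%Y\%m\%d).sql.gz
```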
Troubleshooting
No torrents appearing after startup
Symptom: Web interface shows 0 torrents after 10+ minutes.
Fix: Ensure port 3334 is open for both TCP and UDP traffic. The DHT crawler needs bidirectional communication. Check your firewall: sudo ufw allow 3334/tcp && sudo ufw allow 3334/udp. Also verify the crawler is running in logs: docker compose logs bitmagnet | grep "dht".
Database growing too fast
Symptom: PostgreSQL data volume consuming 50+ GB within a week.
Fix: Bitmagnet stores metadata for every discovered torrent. If disk space is a concern, lower the scaling factor or run periodic cleanup. The project is actively adding retention policies — check the latest release notes.
High CPU usage from DHT crawler
Symptom: Container using 100%+ CPU continuously.
Fix: Reduce the scaling factor in your environment: DHT_CRAWLER_SCALING_FACTOR=1. The default is moderate, but on low-powered hardware (2 cores or less), even the default can be demanding.
PostgreSQL shared memory errors
Symptom: PostgreSQL crashes with could not resize shared memory segment errors.
Fix: Ensure shm_size: 1g is set on the PostgreSQL container. The default Docker shared memory (64 MB) is insufficient for Bitmagnet’s query patterns.
Frequently Asked Questions
Is Bitmagnet legal to run?
Bitmagnet crawls the DHT network and indexes torrent metadata (file names, sizes, hashes). It does not download or distribute copyrighted content. Running a DHT crawler is legal in most jurisdictions — it’s the same technology that every BitTorrent client uses to find peers. What you do with the indexed information determines legality.
How does Bitmagnet compare to Jackett or Prowlarr?
Jackett and Prowlarr proxy searches to external tracker sites — they depend on those sites being online and accessible. Bitmagnet builds its own local index by crawling the DHT network directly. No external dependencies, no accounts, no rate limits. The trade-off: Bitmagnet’s index takes time to build and focuses on DHT-discovered torrents, while Jackett/Prowlarr can search dozens of private trackers.
Does Bitmagnet work with Sonarr and Radarr?
Yes. Bitmagnet exposes a Torznab-compatible API endpoint that integrates directly with Sonarr, Radarr, Lidarr, and other *arr stack applications. Add it as an indexer like you would Jackett. The search API returns results from your local database.
How much disk space does Bitmagnet need long-term?
Plan for 50-100+ GB if running continuously. The DHT crawler discovers thousands of new torrents per hour, and all metadata is stored in PostgreSQL. Growth rate depends on your scaling factor setting. You can run periodic cleanup or set retention policies to cap storage.
Can I run Bitmagnet on a Raspberry Pi?
Not recommended. DHT crawling is CPU-intensive, and PostgreSQL needs at least 1 GB shared memory for Bitmagnet’s query patterns. A Raspberry Pi 4 with 8 GB RAM could work at a very low scaling factor, but performance will be poor. A mini PC or VPS with 4+ GB RAM is more practical.
Does Bitmagnet need port forwarding?
Yes. Port 3334 (TCP and UDP) must be accessible from the internet for the DHT crawler to work effectively. Without it, the crawler can still discover torrents but at a much slower rate since it can’t receive incoming DHT queries.
Resource Requirements
- RAM: 1 GB idle (bitmagnet) + 1 GB (PostgreSQL with 1 GB shared memory) = 2 GB minimum
- CPU: Medium-High — DHT crawling is CPU-intensive, especially at higher scaling factors
- Disk: 10 GB initially, growing 1-5 GB/day depending on crawl rate. Plan for 50+ GB long-term.
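As a quick sanity check on disk headroom, divide free space by the daily growth you observe. A trivial shell sketch with illustrative numbers:

```shell
# Illustrative numbers - substitute your own measurements
FREE_GB=100           # free space on the database volume
GROWTH_GB_PER_DAY=3   # observed daily growth (see: docker system df)
DAYS_LEFT=$((FREE_GB / GROWTH_GB_PER_DAY))
echo "About $DAYS_LEFT days of headroom at the current crawl rate."
```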
Verdict
Bitmagnet is the best self-hosted torrent indexer for *arr stack users who want to eliminate dependency on public tracker sites. The DHT crawler works autonomously — no accounts, no APIs, no tracker whitelists. The Torznab integration with Sonarr and Radarr is seamless. The main drawback is resource consumption: it needs a dedicated PostgreSQL instance with generous shared memory, and the database grows continuously. Run it on a machine with at least 4 GB RAM and 100+ GB disk if you plan to keep it running long-term. Still in alpha, so expect occasional breaking changes between versions.