# Docker Deployment
The recommended way to run Nibchat in production.
## Prerequisites
- Docker Engine 24+
- Docker Compose v2+
## Quick start

```shell
git clone https://github.com/nibchat-ai/nibchat.git
cd nibchat
cp server/.env.example server/.env
# Edit server/.env with your settings
docker compose up -d
```
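Once the stack is up, you can have Docker itself monitor the app by adding a healthcheck to the service. This is a sketch, not part of the shipped compose file — the service name `nibchat`, the port 3000, and the assumption that `wget` exists in the image all need checking against your setup:

```yaml
services:
  nibchat:
    healthcheck:
      # Probe the web UI from inside the container
      test: ["CMD", "wget", "-qO-", "http://localhost:3000/"]
      interval: 30s
      timeout: 5s
      retries: 3
```

With this in place, `docker compose ps` reports the service as `healthy` or `unhealthy` instead of just `running`.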
## Environment file

```ini
# Required
OPENAI_API_KEY=sk-...

# Optional — lock model for all users
OPENAI_MODEL=gpt-4o

# Optional — use a different OpenAI-compatible API
OPENAI_BASE_URL=https://api.openai.com/v1

# Optional — web tools
TAVILY_API_KEY=tvly-...
JINA_API_KEY=jina_...

# Required for OAuth MCP servers
PUBLIC_URL=https://chat.example.com
```
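For reference, these variables typically reach the container via an `env_file` entry in the compose file. The excerpt below is a sketch — the service name and exact wiring may differ in the shipped docker-compose.yml:

```yaml
services:
  nibchat:
    # Load server/.env into the container's environment at startup
    env_file:
      - ./server/.env
```

Because the file is read at container start, changes to `server/.env` take effect after `docker compose up -d` recreates the service.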
## Data persistence

Nibchat stores everything in a single SQLite file at `/app/data/nibchat.db` inside the container. The docker-compose.yml mounts the host directory `./data` at `/app/data`, so the database lives at `./data/nibchat.db` on the host:

```yaml
volumes:
  - ./data:/app/data
```
Back up this file to preserve all conversations, memories, and admin accounts.
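A plain `cp` of a database that is being written to can produce a corrupt copy. One safer approach — a sketch assuming the `sqlite3` CLI is installed on the host — is the CLI's `.backup` command, which uses SQLite's online-backup API and takes a consistent snapshot even while Nibchat is running:

```shell
# backup_db: snapshot a (possibly live) SQLite database into a
# date-stamped copy next to the original, e.g.
# ./data/nibchat.db -> ./data/nibchat-2025-01-31.db
backup_db() {
  db="$1"
  out="${db%.db}-$(date +%F).db"
  sqlite3 "$db" ".backup $out"
  echo "backup written to $out"
}
```

Run it as `backup_db ./data/nibchat.db`, then copy the resulting file off-host.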
## Reverse proxy (nginx)

Example nginx config for chat.example.com:

```nginx
server {
    listen 443 ssl;
    server_name chat.example.com;

    ssl_certificate     /etc/ssl/certs/chat.example.com.crt;
    ssl_certificate_key /etc/ssl/private/chat.example.com.key;

    location / {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;

        # Required for SSE streaming
        proxy_set_header Connection '';
        proxy_buffering off;
        proxy_cache off;
        chunked_transfer_encoding on;

        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```
### SSE buffering

The `proxy_buffering off` directive is critical. Without it, nginx buffers the SSE stream and chat responses appear all at once instead of streaming word by word.
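A related gotcha: nginx's `proxy_read_timeout` defaults to 60 seconds, measured between successive reads from the upstream. If a generation can pause longer than that between events, nginx closes the connection mid-stream. Raising the timeout in the same `location` block avoids this — the 300s below is an arbitrary example value, not a Nibchat requirement:

```nginx
location / {
    # ...existing proxy settings from above...

    # Keep long-running SSE connections open during pauses in the stream
    proxy_read_timeout 300s;
}
```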
## Updating

```shell
git pull
docker compose build
docker compose up -d
```
Database migrations run automatically on startup — no manual steps needed.