Flowise provides a drag-and-drop UI for building LLM workflows, RAG pipelines, and AI agents without code.
- Subdomain: `flow.<domain>`
- Container port: 3000
- Data: Docker volume `flowise-data`
- Docs: docs.flowiseai.com
In `.env`:

```env
FLOWISE_VERSION=latest
FLOWISE_USERNAME=admin
FLOWISE_PASSWORD=<strong-password>
FLOWISE_SECRETKEY_OVERWRITE=<random-secret>
```

Generate the secrets:

```bash
openssl rand -hex 32     # for FLOWISE_SECRETKEY_OVERWRITE
openssl rand -base64 16  # for FLOWISE_PASSWORD
```

- Open `https://flow.<domain>` in a browser
- Log in with `FLOWISE_USERNAME` / `FLOWISE_PASSWORD` from `.env`
- Start building flows
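Before opening the UI, a minimal reachability check can save debugging time. This is a sketch: `example.com` stands in for your real domain, and it only inspects the HTTP status code.

```shell
# Probe the Flowise subdomain over HTTPS and report the status code.
DOMAIN="example.com"   # replace with your real domain
STATUS=$(curl -s -o /dev/null -w '%{http_code}' "https://flow.${DOMAIN}" || true)
echo "HTTP status: ${STATUS}"   # 000 means unreachable; 2xx/3xx means the proxy answers
```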
- In Flowise, add a ChatOllama node
- Set Base URL: `http://ollama-compose:11434` (the Docker network address)
- Select model: `llama3.2`

Use `ollama-compose` (the container name), not `localhost`: Flowise runs inside Docker, so it must reach Ollama by its container network name.
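To confirm the connection before wiring up the node, you can query Ollama's model list from inside the Flowise container. This is a sketch: it assumes the Flowise container ships `wget` (swap in `curl` if not) and that the Ollama service is reachable as `ollama-compose` on the compose network.

```shell
# Query Ollama's /api/tags (lists installed models) from the Flowise
# container; fall back to a message if the call fails.
RESULT=$(docker compose exec flowise sh -c \
  'wget -qO- http://ollama-compose:11434/api/tags' 2>/dev/null \
  || echo "Ollama not reachable from the Flowise container")
echo "$RESULT"
```

If the JSON model list comes back, the ChatOllama node's Base URL will work as-is.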
- Add a Qdrant node
- Set URL: `http://qdrant-compose:6333`
- Set API Key if configured
- Set Collection Name
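The same container-name rule applies to Qdrant. A quick sanity check (sketch; assumes the service is named `qdrant-compose` — if you configured an API key, Qdrant expects it in an `api-key` header):

```shell
# Qdrant's REST API lists existing collections at GET /collections.
# Add: -H "api-key: <your-key>" if authentication is enabled.
COLLECTIONS=$(curl -s http://qdrant-compose:6333/collections 2>/dev/null \
  || echo "Qdrant not reachable")
echo "$COLLECTIONS"
```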
- Document Loader → PDF, text, or web scraper
- Text Splitter → Recursive Character Text Splitter (chunk size: 1000)
- Embeddings → Ollama Embeddings (`nomic-embed-text`)
- Vector Store → Qdrant (`http://qdrant-compose:6333`)
- Chat Model → ChatOllama (`llama3.2`)
- Conversational Retrieval QA Chain → connects all nodes
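If you want to create the Qdrant collection yourself rather than letting the node do it, the vector size must match the embedding model — `nomic-embed-text` produces 768-dimensional vectors. A sketch (the collection name `docs` is a placeholder):

```shell
# Collection config matching nomic-embed-text (768 dims, cosine distance).
PAYLOAD='{"vectors": {"size": 768, "distance": "Cosine"}}'
# Validate the JSON locally before sending it.
echo "$PAYLOAD" | python3 -m json.tool > /dev/null && echo "payload ok"
# Create the collection (run from a machine that can reach the compose network):
# curl -X PUT "http://qdrant-compose:6333/collections/docs" \
#   -H "Content-Type: application/json" \
#   -d "$PAYLOAD"
```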
```bash
# List chatflows
curl -H "Authorization: Bearer <api-key>" \
  https://flow.<domain>/api/v1/chatflows

# Send a message to a chatflow
curl -X POST https://flow.<domain>/api/v1/prediction/<chatflow-id> \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <api-key>" \
  -d '{"question": "What is Docker?"}'
```

- Settings → API Keys → Add New
- Copy the key for external access
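The prediction endpoint returns JSON with the model's answer, typically in a `text` field (verify against your Flowise version). A parsing sketch that uses a stand-in payload instead of a live call, and plain `python3` instead of requiring `jq`:

```shell
# Stand-in for the JSON a real prediction call would return.
RESPONSE='{"text": "Docker is a container platform."}'
# Extract the answer field.
ANSWER=$(printf '%s' "$RESPONSE" \
  | python3 -c 'import json,sys; print(json.load(sys.stdin)["text"])')
echo "$ANSWER"
```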
```bash
# Backup
docker compose stop flowise
docker run --rm -v ai-lab-server-setup_flowise-data:/data -v $(pwd):/backup \
  ubuntu tar czf /backup/flowise-backup-$(date +%Y%m%d).tar.gz /data
docker compose start flowise

# Restore
docker compose stop flowise
docker run --rm -v ai-lab-server-setup_flowise-data:/data -v $(pwd):/backup \
  ubuntu bash -c "rm -rf /data/* && tar xzf /backup/flowise-backup-YYYYMMDD.tar.gz -C /"
docker compose start flowise
```

| Issue | Solution |
|---|---|
| Can't connect to Ollama | Use `http://ollama-compose:11434`, not `localhost` |
| Can't connect to Qdrant | Use `http://qdrant-compose:6333`, not `localhost` |
| 502 Bad Gateway | Check the container: `docker compose ps flowise` |
| Login fails | Verify credentials in `.env`, then restart: `docker compose restart flowise` |
| Slow responses | Normal for CPU-only Ollama; consider a larger server |
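Before relying on a backup for a restore, it is worth listing the archive's contents. A sketch using the filename pattern from the backup command above:

```shell
# Today's backup, per the naming scheme flowise-backup-YYYYMMDD.tar.gz.
BACKUP="flowise-backup-$(date +%Y%m%d).tar.gz"
echo "Expecting archive: ${BACKUP}"
# List contents without extracting (uncomment once the archive exists):
# tar tzf "${BACKUP}" | head
```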