Run Ollama with GPU on Google Colab, exposed via Cloudflare tunnel.
- Click badge above → opens Colab with Tesla T4 selected
- Run cell → wait for tunnel URL (e.g. https://xxx.trycloudflare.com)
- Copy URL to `test.py` or your `.env`
- `server.ipynb` — Installs Ollama, starts GPU server, opens tunnel
- `test.py` — Client-side test script
```sh
# Run tests
uv run python src/modules/llm/colab/test.py
```

Edit `MODEL` in the Colab form sidebar (default: `qwen3.5:9b`)
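If you want to sanity-check the tunnel without the repo's test script, a minimal client sketch is below. It assumes the standard Ollama REST endpoint `/api/generate` behind the tunnel; the `OLLAMA_URL` and `MODEL` environment variable names are illustrative, not necessarily what `test.py` uses.

```python
import json
import os
import urllib.request

# Assumed configuration: tunnel URL copied from the Colab output,
# model name matching the Colab form sidebar setting.
OLLAMA_URL = os.environ.get("OLLAMA_URL", "https://xxx.trycloudflare.com")
MODEL = os.environ.get("MODEL", "qwen3.5:9b")


def build_request(prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({
        "model": MODEL,
        "prompt": prompt,
        "stream": False,  # ask for a single JSON response, not a stream
    }).encode("utf-8")
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


if __name__ == "__main__":
    req = build_request("Say hello in one word.")
    with urllib.request.urlopen(req, timeout=120) as resp:
        print(json.loads(resp.read())["response"])
```

Run it as `OLLAMA_URL=https://<your-tunnel>.trycloudflare.com python client.py` once the Colab cell prints the tunnel URL.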