
LocalChat 🌐

A private web interface for local LLMs. Works with Ollama, LM Studio, and any OpenAI-compatible API.

✨ Features

  • 🔒 Complete privacy: everything runs locally
  • 🎨 Dark/light theme (follows the system setting)
  • 📝 Markdown rendering and code highlighting
  • 🕒 Chat history (stored in localStorage)
  • 🧠 Model selection from a drop-down list (Ollama)
  • 📱 Responsive design

🚀 Installation

Requirements

  • Node.js and npm
  • A running local LLM backend, e.g. Ollama, LM Studio, or any OpenAI-compatible server
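
If Ollama is your backend, a quick way to prepare it (assuming the ollama CLI is already installed) is to pull the default model ahead of time:

# download the model LocalChat uses by default
ollama pull llama3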

Running locally

git clone https://github.com/mscbuild/localchat.git
cd localchat
npm install
npm run dev

Then open http://localhost:3000 in your browser.

Docker

docker build -t localchat .
# --network host lets the container reach Ollama on the host's localhost:11434;
# with host networking, the -p port mapping would be ignored, so it is omitted
docker run --network host localchat
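
If you prefer to keep the default bridge network instead of host networking (for example on Docker Desktop), a possible alternative is to map the port and point the app at the host's Ollama through host.docker.internal; whether the image reads OLLAMA_BASE_URL at container runtime is an assumption here:

docker run -p 3000:3000 -e OLLAMA_BASE_URL=http://host.docker.internal:11434 localchat

On Linux you may also need --add-host=host.docker.internal:host-gateway for that hostname to resolve.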

⚙️ Configuration

By default, it connects to:

  • API: http://localhost:11434 (Ollama)
  • Model: llama3
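
To confirm the backend is actually reachable at that address, you can query Ollama's model-listing endpoint directly (a simple sanity check, assuming a standard Ollama install on the default port):

# should return JSON listing the locally installed models, llama3 among them
curl http://localhost:11434/api/tags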

You can change these in the interface (via the model drop-down) or through environment variables:

OLLAMA_BASE_URL=http://192.168.1.100:11434 npm run dev
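
The same variable should also let you point the app at other OpenAI-compatible servers; for example, LM Studio's local server listens on port 1234 by default (whether the /v1 suffix is needed depends on how the app builds its request URLs, so treat this as a sketch):

OLLAMA_BASE_URL=http://localhost:1234/v1 npm run dev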

✅ What's next?

To publish your own copy of the project, run the following in a terminal (replace the remote URL with your own repository):

git init
git add .
git commit -m "feat: initial commit"
git branch -M main
git remote add origin https://github.com/mscbuild/localchat.git
git push -u origin main

📄 MIT License
