Self-host ChatGPT
AI chat assistant (consumer UI) · Category: AI / LLM tooling
ChatGPT is OpenAI's hosted chat product — a web/mobile UI on top of GPT-4o with conversation history, file uploads, image generation, and tools. The self-hostable replacements pair an open-weight model (via Ollama or your favorite inference server) with a polished chat UI that matches the conversation/history/file-attach surface most users actually interact with.
ChatGPT pricing anchor: ChatGPT Plus $20/user/mo; Team $25/user/mo with shared workspace.
Open WebUI
- GitHub: ★ 136.4k · last commit today · 233 open issues
- License: BSD-3-Clause
- Setup time: 10min docker-compose (Open WebUI + Ollama)
- Monthly cost: $10 VPS for the UI; the model server adds the real cost — $0 on a workstation GPU, $200+/mo on cloud GPU.
Migration sketch. Run `docker compose up` with the bundled Ollama + Open WebUI compose file from the project README. The first account created on first load becomes the admin; pull a model from inside the UI (Models → Pull → `llama3.1:8b`). ChatGPT history doesn't export cleanly — paste threads into a markdown file and use the Knowledge feature to make them searchable inside Open WebUI conversations.
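If you'd rather skip the compose file entirely, the project also publishes a single image tag that bundles Ollama with the UI. A minimal sketch, assuming the image name and ports from the project's README at the time of writing (verify against current docs before relying on them):

```shell
# One container serving both the chat UI and the Ollama model runtime.
# Ports/volume names are placeholders — adjust to your environment.
docker run -d \
  -p 3000:8080 \
  -v ollama:/root/.ollama \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:ollama
# Then open http://localhost:3000 and create the first (admin) account.
```

The two named volumes keep pulled model weights and chat history across container upgrades, which is the main operational gotcha with the single-container route.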
Good fit for: Teams that want a ChatGPT-shaped multi-user web app with RBAC, prompt templates, and a built-in RAG knowledge base.
Weak at: Image generation and voice mode — these need bolt-on integrations (e.g. Automatic1111 for Stable Diffusion) rather than working out of the box.
LibreChat
- GitHub: ★ 36.8k · last commit today · 422 open issues
- License: MIT
- Setup time: 20min docker-compose (LibreChat + Mongo + Meilisearch)
- Monthly cost: $10-15 VPS for the app stack; same model-server caveat as Open WebUI.
Migration sketch. Clone the repo, copy `.env.example` to `.env`, set `OPENAI_API_KEY` (or point `OPENAI_REVERSE_PROXY` at a local Ollama / LiteLLM URL), then `docker compose up -d`. LibreChat speaks to OpenAI, Anthropic, Google, and any OpenAI-compatible local server side by side; users pick the model per conversation. ChatGPT export.zip can be partially imported via the conversation import endpoint (one JSON per thread).
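The local-server case from the sketch above comes down to two lines in `.env`. A hedged fragment, assuming an Ollama instance on its default port — the exact URL shape depends on which OpenAI-compatible server you front, so treat the value as a placeholder:

```shell
# .env fragment for pointing LibreChat at a local OpenAI-compatible server.
# Variable names are the ones referenced in the migration notes; most local
# servers accept (and ignore) any non-empty API key.
OPENAI_API_KEY=sk-local-placeholder
OPENAI_REVERSE_PROXY=http://localhost:11434/v1
```

After editing, `docker compose up -d` again — the app picks up the proxy URL on restart, and the local models appear alongside any hosted providers you've configured.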
Good fit for: Power users who want one chat UI in front of many providers — switch models mid-conversation, fork threads, plugin support.
Weak at: Heavier stack (Mongo + Meilisearch + RAG service) — overkill for single-user installs.
Jan
- GitHub: ★ 42.4k · last commit 3d ago · 398 open issues
- License: AGPL-3.0
- Setup time: 5min installer (desktop app)
- Monthly cost: Free; it's a local desktop app — no server to host. Cost is the workstation that runs the model.
Migration sketch. Download the installer for macOS/Windows/Linux from jan.ai, launch, and pull a model from the Hub tab. Conversations live in `~/jan/threads/` as JSON; ChatGPT export.zip can be dropped into a folder and queried via the assistant feature with the local files as context.
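Because threads are plain JSON on disk, ordinary text tools double as a search layer over your chat history. A small sketch, assuming the `~/jan/threads/` layout described above (the per-thread file names may differ between Jan versions):

```shell
# List thread files that mention a search term anywhere in their JSON.
grep -rl "docker" ~/jan/threads/ 2>/dev/null

# Pretty-print one thread file for inspection (pick a path from the list).
python3 -m json.tool "$(grep -rl docker ~/jan/threads/ 2>/dev/null | head -n1)"
```

This is also the escape hatch for backups and migration: rsync the directory and the full conversation history moves with it.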
Good fit for: Single users who want an Electron desktop ChatGPT replacement with no server, no Docker, no account.
Weak at: Multi-user / team scenarios — Jan is fundamentally local-first; for shared use deploy Open WebUI or LibreChat instead.
In a terminal? `npx os-alt chatgpt` prints this table — how the CLI works →