Open-source multi-LLM chat client (LobeChat) plus hosted cloud with agents, plugins, and a skills marketplace.
One of the most polished open-source multi-LLM chat clients in 2026. The self-hosted edition is the right pick if you already pay per token and want an upgrade over provider-native UIs.
Last verified: April 2026
Sweet spot: a developer or technical power user who already pays per token across 2+ providers and wants a single, attractive UI they control. The one-click Vercel deploy is genuinely fast, and the skills marketplace gives you a low-effort way to try agent patterns without writing prompts from scratch.

Failure modes: the cloud tier's credit model and invite-only access make it hard to recommend as a turnkey product right now; referral credits are nice in theory but messy in practice for budgeting. Self-hosting means you own the operational work: model-key rotation, a vector DB if you enable RAG, deploy maintenance. If you're not comfortable redeploying when the project ships breaking updates, pick a stable hosted alternative.

What to pilot: deploy the self-hosted edition on Vercel with your existing OpenAI and Anthropic keys, install 2–3 marketplace skills, and use it as your daily driver for a week. If the UI replaces your habit of jumping between provider apps, the self-host is the right fit; if you find yourself reaching past it for ChatGPT or Claude.ai, you don't need a unified client and a single subscription is simpler.
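The pilot described above mostly comes down to setting a few environment variables on your Vercel project. A minimal sketch, assuming the variable names documented in the LobeChat README at time of writing (`OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, `ACCESS_CODE`); confirm against the project's current deployment guide before relying on them:

```shell
# Illustrative setup for a self-hosted LobeChat pilot on Vercel.
# Variable names follow the LobeChat docs as of this writing; verify before use.
vercel env add OPENAI_API_KEY      # your existing OpenAI key
vercel env add ANTHROPIC_API_KEY   # your existing Anthropic key
vercel env add ACCESS_CODE         # optional password gating the public deploy
vercel --prod                      # redeploy so the new env vars take effect
```

Setting `ACCESS_CODE` matters for a public Vercel deploy: without some gate, anyone who finds the URL can burn your API credits.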
LobeHub (formerly LobeChat) is an open-source AI productivity platform built around a polished multi-model chat UI. The self-hosted edition supports OpenAI, Anthropic, Gemini, DeepSeek, Qwen, Moonshot, Zhipu, Ollama, OpenRouter, and 30+ other providers: drop in your API keys and chat with any of them from the same interface. It runs as a Next.js app you deploy to Vercel, Docker, or your own infra in minutes.

Beyond chat, LobeHub adds a plugin / function-calling system, a vision-and-voice-capable model interface, image generation via DALL-E and Stable Diffusion, custom agents with system prompts and knowledge bases, and a community marketplace of pre-built agent personas and "skills." A growing catalogue of agent skills lets users install task-specific assistants (coding reviewer, translation specialist, copywriter) without building them from scratch.

The hosted LobeHub Cloud tier is currently invite-only with a credit-based pricing model (referrals award additional credits) and bundles all providers behind a single managed bill. For users tired of running their own Vercel deploy, the cloud is the easy button; for developers and privacy-focused users, the open-source self-host is the same product without the managed layer.
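For the Docker path mentioned above, a single container is usually enough for personal use. An illustrative one-liner, assuming the `lobehub/lobe-chat` image name, port 3210, and the env var names from the project README; these are the project's documented defaults at time of writing, but check the README before copying, since the project ships breaking updates:

```shell
# Illustrative single-container deploy of self-hosted LobeChat.
# Image name, port, and env vars follow the project README as of this writing.
docker run -d --name lobe-chat \
  -p 3210:3210 \
  -e OPENAI_API_KEY=sk-... \
  -e ANTHROPIC_API_KEY=sk-ant-... \
  -e ACCESS_CODE=change-me \
  --restart unless-stopped \
  lobehub/lobe-chat
```

This covers chat only; enabling RAG/knowledge bases in self-host adds a database and vector store to operate, which is part of the maintenance cost flagged above.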
Self-hosting requires a working Vercel or Docker setup; it is not zero-effort for non-developers. The cloud tier is invite-only, with a credit-based model that can be hard to budget against. Skills marketplace quality is uneven (community-published, not curated). Multi-user and team governance is lighter than on enterprise platforms.