Self-hosted live chat you buy once

Own the software.
Own the conversation.

ZChat is self-hosted live chat with a native desktop agent, web dashboard, embeddable widget, and AI integrations, all in one package you deploy on your own infrastructure.

One-time purchase · Perpetual license · No per-agent fees
ZChat admin dashboard

4 products in one package: server, dashboard, widget, desktop agent

3 AI providers supported: Ollama, OpenAI, Anthropic

$0 in monthly per-agent fees, ever

Why teams move to ZChat

Most hosted chat tools sell access. ZChat sells software your team can actually own.

That matters when support data, brand control, infrastructure standards, and recurring cost all land on the same decision.

Own the conversation data

Keep transcripts and operational data in systems your team already audits instead of pushing customer conversations into another vendor silo.

Keep the stack practical

Install the package, configure the host, add the widget, and move forward without stitching together multiple overlapping products.

AI on your terms

Keep AI optional, bring your own provider, and stay in control of cost.

Run Ollama locally for private deployments or connect OpenAI and Anthropic using your own accounts and billing.

Typical flow

AI can greet, qualify, and draft.

When the conversation needs a person, ZChat hands it off with the context already attached.

Operator economics

$299

one-time purchase

Perpetual license, predictable cost, and no monthly seat tax when your team grows.

Native desktop agent

A real Windows client with presence, notifications, and a faster workflow for teams that live in support all day.

Real-time transport

Real-time delivery keeps typing, routing, alerts, and handoff feeling immediate for both agents and visitors.

Deployment flexibility

Use the hosting model your team already knows how to operate.

ZChat fits Windows-first environments, Linux-heavy teams, and Docker-friendly workflows without changing the commercial model.

IIS · Nginx · SQL Server · Docker (optional)
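For Nginx-fronted hosts, the reverse proxy mainly needs to pass WebSocket upgrades through so the real-time SignalR transport keeps working. A minimal sketch, assuming ZChat listens on localhost port 5000 and `chat.example.com` stands in for your own domain:

```nginx
# Hypothetical reverse proxy for a ZChat instance on 127.0.0.1:5000.
server {
    listen 443 ssl;
    server_name chat.example.com;

    location / {
        proxy_pass http://127.0.0.1:5000;
        # SignalR needs WebSocket upgrade headers forwarded by the proxy.
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}
```

The port and server name are placeholders; check the installation guide for the values your deployment actually uses.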

Windows

A natural fit for Microsoft-centric teams

Linux

A lean host path for mixed environments

Cloud VM or on-prem

Deploy wherever governance, security, and operations already make sense.

appsettings.json
{
  "Chatbot": {
    "Provider": "Ollama",
    "OllamaUrl": "http://localhost:11434",
    "Model": "llama3"
  },
  "Realtime": {
    "Transport": "SignalR"
  }
}

// Or switch to your own cloud key
{
  "Chatbot": {
    "Provider": "OpenAI",
    "ApiKey": "sk-your-key-here",
    "Model": "gpt-4o"
  }
}

AI without lock-in

The support workflow stays the same even when your AI strategy changes.

Some teams need private local inference. Others already have cloud provider contracts. ZChat supports both without turning AI into another vendor lock-in decision.

Local-first when privacy matters

Use Ollama to keep prompts and responses inside your environment for regulated or privacy-sensitive support workflows.

Cloud-ready when you want frontier models

Bring your own OpenAI or Anthropic credentials and keep provider billing separate from your ZChat license.
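An Anthropic setup can mirror the Ollama and OpenAI appsettings.json examples shown on this page. The key names below follow that same pattern but are assumptions, so confirm the shipped schema against the installation guide:

```json
{
  "Chatbot": {
    "Provider": "Anthropic",
    "ApiKey": "your-anthropic-key",
    "Model": "claude-3-5-sonnet-latest"
  }
}
```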

Human handoff built into the journey

AI can greet, qualify, and draft while agents step in with the full conversation history already attached.

Operator workflow

Give support teams a workspace that feels like software, not a lightweight add-on.

Operators can work from a native Windows client or the browser dashboard while keeping alerts, visitor context, and routing decisions visible the whole time.

Native alerts and presence

System tray visibility and notifications keep busy teams from losing conversations inside a crowded browser.

Visitor context in real time

See page history, geography, and route status without breaking the conversation flow.

Fast handoff between AI and human

Move from automated response to a live operator without starting over or copying context around.

Browser access that still feels serious

The web console stays clean and capable on any device when the desktop client is not the right fit.

ZChat desktop agent application

Launch without a giant implementation detour

Move from evaluation to production without changing products.

Install the package, connect the database, configure teams and optional AI, then drop the widget into your website. The same core product goes from localhost test to production rollout.

Read the installation guide

Step 1

Install

Choose the host you want and bring the core package online quickly.

Step 2

Configure

Set teams, routing, availability, and the AI provider strategy that fits the environment.

Step 3

Embed

Add the widget to any website and start handling conversations in real time.
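The embed step usually comes down to one script tag on your site. The snippet below is illustrative only: the `src` URL and data attribute are placeholders, and the real embed code comes from your own ZChat installation.

```html
<!-- Hypothetical embed snippet; copy the actual one from your ZChat dashboard. -->
<script
  src="https://chat.example.com/widget/zchat.js"
  data-team="support"
  async></script>
```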

Built for teams with higher stakes

Teams usually switch when monthly fees, data exposure, or workflow limits stop feeling acceptable.

ZChat is a strong fit for organizations that want owned software, operational control, and a cleaner long-term cost model than hosted chat subscriptions.

"We wanted a chat platform we could actually own. ZChat gave us the control we needed without making the operator experience feel outdated."

Dave Frederiksen

Web Developer

"We needed self-hosted live chat for compliance reasons. ZChat gave us modern tooling without pushing our data into someone else's cloud."

Angela Danil

IT Director

"The one-time purchase model made the decision easier, and the product gave us a workflow we could shape around our own process."

Sara Lisbon

Designer

Owned customer support software

Deploy live chat on your own terms, not on someone else's pricing model.

ZChat gives you the installable server, web dashboard, website widget, and desktop agent tools in one self-hosted product you buy once and keep. Run it on infrastructure you trust and connect AI only if and how you want it.

Deployment

Install on Windows or Linux, behind IIS or Nginx, in a VM, or in Docker if that fits your stack.
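For the Docker path, a compose file keeps the install reproducible. This is a sketch under stated assumptions: the image name, port, and volume paths are illustrative, not published ZChat artifact names.

```yaml
# Hypothetical docker-compose.yml for a self-hosted ZChat server.
services:
  zchat:
    image: zchat/server:latest   # placeholder image name
    ports:
      - "5000:5000"
    volumes:
      - ./appsettings.json:/app/appsettings.json:ro
      - zchat-data:/app/data
    restart: unless-stopped

volumes:
  zchat-data:
```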

Commercial model

One-time purchase, perpetual license, and no monthly per-agent bill attached to growth.

AI flexibility

Use Ollama locally or connect OpenAI and Anthropic with your own provider accounts.