
Quick Start

This guide gets you from a fresh build to your first chat. Ensure you’ve completed Installation. Use ./build/human if running from the project directory, or human if the binary is on your PATH.
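If you are unsure which of the two applies, a quick check (a sketch; `command -v` is standard POSIX and reports where a command resolves from):

```shell
# Prints the path if human is on PATH; otherwise suggests the build output
command -v human || echo "not on PATH - use ./build/human"
```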

Human reads its configuration from ~/.human/config.json. Create the directory first:

mkdir -p ~/.human

Run the onboard wizard to generate a config interactively:

human onboard

The wizard will:

  • Ask you to choose a provider (OpenAI, Anthropic, Ollama, OpenRouter)
  • Prompt for an API key (or you can set it via environment variable)
  • Ask for a default model (e.g. gpt-4o, claude-sonnet-4-20250514, llama3)
  • Create ~/.human/config.json and workspace templates

If you prefer manual setup instead of the wizard, create ~/.human/config.json for your chosen provider.

OpenAI (cloud, requires API key):

{
  "default_provider": "openai",
  "default_model": "gpt-4o",
  "providers": [
    {
      "name": "openai",
      "api_key": "sk-proj-your-key-here"
    }
  ]
}
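The wizard also lists Anthropic as a provider; its config follows the same shape. This is a sketch by analogy with the OpenAI example above (the model name is the one the wizard suggests; the key value is a placeholder):

```json
{
  "default_provider": "anthropic",
  "default_model": "claude-sonnet-4-20250514",
  "providers": [
    {
      "name": "anthropic",
      "api_key": "sk-ant-your-key-here"
    }
  ]
}
```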

Ollama (local, no API key):

{
  "default_provider": "ollama",
  "default_model": "llama3",
  "providers": [
    {
      "name": "ollama",
      "base_url": "http://localhost:11434"
    }
  ]
}

Ensure Ollama is running (ollama serve) and you’ve pulled a model (ollama pull llama3).

OpenRouter (single API for many models):

{
  "default_provider": "openrouter",
  "default_model": "anthropic/claude-sonnet-4",
  "providers": [
    {
      "name": "openrouter",
      "api_key": "sk-or-v1-your-key-here"
    }
  ]
}
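Whichever provider you pick, you can write the file and confirm it is valid JSON from the shell. A sketch using the Ollama example above (python3 is used here only as a JSON validator):

```shell
mkdir -p ~/.human
cat > ~/.human/config.json <<'EOF'
{
  "default_provider": "ollama",
  "default_model": "llama3",
  "providers": [
    { "name": "ollama", "base_url": "http://localhost:11434" }
  ]
}
EOF
# Fails with a parse error if the JSON is malformed
python3 -m json.tool ~/.human/config.json >/dev/null && echo "config OK"
```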

Run diagnostics to confirm your setup:

human doctor

Example output (local provider):

[doctor] config: ok (loaded from /Users/you/.human/config.json)
[doctor] provider (ollama): ok (local — no API key required)
[doctor] memory engine: markdown

Example output (cloud provider with API key):

[doctor] config: ok (loaded from /Users/you/.human/config.json)
[doctor] provider (openai): ok (API key configured)
[doctor] memory engine: sqlite

If you see "warning — no API key", add the key to your provider config or set the appropriate environment variable (e.g. OPENAI_API_KEY).
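For example, with OpenAI the key can come from the environment instead of the config file (the key value below is a placeholder):

```shell
# Export the key for the current shell session
export OPENAI_API_KEY="sk-proj-your-key-here"
```

Re-run `human doctor` afterwards; it should now report the key as configured.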

Run a one-off message:

human agent -m "hello"

Example output:

> hello
Hello! How can I help you today?

Or start an interactive chat:

human agent

Type messages and press Enter. Use Ctrl+C to exit.

Use --provider and --model to switch without editing config:

human agent -m "What is 2 + 2?" --provider ollama --model mistral

If you want to receive webhooks (e.g. Telegram, Slack, Discord):

human gateway

By default it listens on 127.0.0.1:3000. Pair with a 6-digit code to get a bearer token for authenticated requests.

| Command                                      | Description              |
| -------------------------------------------- | ------------------------ |
| human agent                                  | Interactive chat session |
| human agent -m "message"                     | Single message           |
| human agent --provider ollama --model llama3 | Override provider/model  |
| human onboard                                | First-run setup wizard   |
| human doctor                                 | Diagnostics              |
| human gateway                                | Start webhook server     |