# Quick Start
This guide gets you from a fresh build to your first chat. Ensure you’ve completed Installation. Use `./build/human` if running from the project directory, or `human` if the binary is on your PATH.
## 1. Create your config
Human reads its config from `~/.human/config.json`. Create the directory first:

```sh
mkdir -p ~/.human
```

### Option A: Interactive wizard

Run the `onboard` wizard to generate a config interactively:
```sh
human onboard
```

The wizard will:
- Ask you to choose a provider (OpenAI, Anthropic, Ollama, OpenRouter)
- Prompt for an API key (or you can set it via environment variable)
- Ask for a default model (e.g. `gpt-4o`, `claude-sonnet-4-20250514`, `llama3`)
- Create `~/.human/config.json` and workspace templates
### Option B: Manual config

Create `~/.human/config.json` with your chosen provider.
OpenAI (cloud, requires API key):
```json
{
  "default_provider": "openai",
  "default_model": "gpt-4o",
  "providers": [
    { "name": "openai", "api_key": "sk-proj-your-key-here" }
  ]
}
```

Ollama (local, no API key):
```json
{
  "default_provider": "ollama",
  "default_model": "llama3",
  "providers": [
    { "name": "ollama", "base_url": "http://localhost:11434" }
  ]
}
```

Ensure Ollama is running (`ollama serve`) and that you’ve pulled a model (`ollama pull llama3`).
OpenRouter (single API for many models):
```json
{
  "default_provider": "openrouter",
  "default_model": "anthropic/claude-sonnet-4",
  "providers": [
    { "name": "openrouter", "api_key": "sk-or-v1-your-key-here" }
  ]
}
```
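Whichever provider you choose, a quick syntax check catches JSON typos before the CLI does. A minimal sketch that writes the Ollama example from above and verifies it parses (assumes `python3` is installed; any JSON linter works equally well):

```sh
# Write the Ollama example config (note: this overwrites any existing config)
mkdir -p "$HOME/.human"
cat > "$HOME/.human/config.json" <<'EOF'
{
  "default_provider": "ollama",
  "default_model": "llama3",
  "providers": [
    { "name": "ollama", "base_url": "http://localhost:11434" }
  ]
}
EOF

# Fail fast on JSON syntax errors before the CLI ever reads the file
python3 -m json.tool "$HOME/.human/config.json" > /dev/null && echo "config.json is valid JSON"
```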
## 2. Verify with doctor

Run diagnostics to confirm your setup:
```sh
human doctor
```

Example output (local provider):
```
[doctor] config: ok (loaded from /Users/you/.human/config.json)
[doctor] provider (ollama): ok (local — no API key required)
[doctor] memory engine: markdown
```

Example output (cloud provider with API key):
```
[doctor] config: ok (loaded from /Users/you/.human/config.json)
[doctor] provider (openai): ok (API key configured)
[doctor] memory engine: sqlite
```

If you see `warning — no API key`, add the key to your provider config or set the appropriate environment variable (e.g. `OPENAI_API_KEY`).
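Setting the key via environment variable keeps it out of the config file. A sketch for a POSIX shell, using `OPENAI_API_KEY` (the variable `human doctor` mentions; the placeholder key value is illustrative):

```sh
# Export the key for the current shell session only
export OPENAI_API_KEY="sk-proj-your-key-here"

# Persist it for future sessions (bash shown; zsh users: ~/.zshrc)
echo 'export OPENAI_API_KEY="sk-proj-your-key-here"' >> ~/.bashrc

# Confirm it is visible to child processes such as the human CLI
printenv OPENAI_API_KEY
```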
## 3. Send your first message
Run a one-off message:
```sh
human agent -m "hello"
```

Example output:
```
> hello
Hello! How can I help you today?
```

Or start an interactive chat:
```sh
human agent
```

Type messages and press Enter. Use Ctrl+C to exit.
## 4. Override provider or model
Use `--provider` and `--model` to switch without editing your config:
```sh
human agent -m "What is 2 + 2?" --provider ollama --model mistral
```

## 5. Start the gateway (optional)
If you want to receive webhooks (e.g. from Telegram, Slack, or Discord):
```sh
human gateway
```

By default it listens on `127.0.0.1:3000`. Pair with a 6-digit code to get a bearer token for authenticated requests.
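Once the gateway is up, you can confirm it is listening from another terminal. A hedged sketch: the gateway's actual routes and status codes are not documented here, so treat the root path as a placeholder and read any HTTP status (even 404) as proof the server is accepting connections:

```sh
# Probe the default listen address; prints the HTTP status code if something
# is listening, or a fallback message if the connection is refused
curl -s -o /dev/null -w "%{http_code}\n" http://127.0.0.1:3000/ \
  || echo "gateway not reachable on 127.0.0.1:3000"
```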
## CLI reference

| Command | Description |
|---|---|
| `human agent` | Interactive chat session |
| `human agent -m "message"` | Single message |
| `human agent --provider ollama --model llama3` | Override provider/model |
| `human onboard` | First-run setup wizard |
| `human doctor` | Diagnostics |
| `human gateway` | Start webhook server |