Outclaw Documentation
Outclaw is an open-source AI marketing command center built on the MKT1 org structure. It deploys 21 specialized AI agents organized into three sub-functions — Product Marketing, Content & Brand, and Growth Marketing — coordinated through the GACCS brief framework and managed via a unified Command Center.
Quick Start
The fastest way to get Outclaw running is with Docker:
git clone https://github.com/outmarkhq/outclaw.git
cd outclaw
cp .env.example .env
# Edit .env with your LLM API keys and database credentials
docker compose up -d
Then open http://localhost:3000 in your browser. The setup wizard will guide you through workspace creation, LLM provider configuration, and agent provisioning.
Self-Hosted Requirements
Hardware
| Component | Minimum | Recommended |
|---|---|---|
| CPU | 2 cores | 4 cores |
| RAM | 4 GB | 8 GB |
| Storage | 20 GB SSD | 50 GB SSD |
Software
- Docker 20.10+ and Docker Compose v2 (recommended)
- Or: Node.js 20+, MySQL 8.0+ / TiDB, Redis (optional)
- Reverse proxy: Nginx, Caddy, or Traefik for SSL termination
Docker Deployment
Docker is the recommended deployment method. It runs the application, database, and all dependencies as a single docker-compose.yml stack.
1. Clone the repository
git clone https://github.com/outmarkhq/outclaw.git
cd outclaw
2. Configure environment
cp .env.example .env
Edit .env with your configuration. At minimum, you need:
| Variable | Description | Required |
|---|---|---|
| DATABASE_URL | MySQL connection string | Yes |
| JWT_SECRET | Session signing secret (generate with openssl rand -hex 32) | Yes |
| LLM_PROVIDER | Default LLM provider (openai, anthropic, google, openrouter) | Yes |
| LLM_API_KEY | Your LLM provider API key | Yes |
| APP_URL | Public URL of your Outclaw instance | Yes |
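A minimal .env covering the required variables might look like this. All values are placeholders: generate JWT_SECRET with openssl rand -hex 32 and paste the literal value (Docker Compose does not expand command substitutions in .env files), and the db hostname assumes the default Compose service name.

```shell
# Minimal .env sketch; every value below is a placeholder
DATABASE_URL="mysql://outclaw:change-me@db:3306/outclaw"
JWT_SECRET="paste-output-of-openssl-rand-hex-32-here"
LLM_PROVIDER="anthropic"
LLM_API_KEY="sk-ant-your-key"
APP_URL="https://command.yourdomain.com"
```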
3. Start the services
docker compose up -d
4. Run database migrations
docker compose exec app pnpm db:push
5. Access the application
Open http://localhost:3000 (or your configured APP_URL). The first user to sign up becomes the workspace owner.
Linux (Manual Installation)
Use this method for production deployments without Docker, or when you need more control over the stack.
1. Install dependencies
# Node.js 20+ (via nvm)
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.40.0/install.sh | bash
nvm install 20
nvm use 20
# pnpm
npm install -g pnpm
# MySQL 8.0+
sudo apt install mysql-server
# AlphaClaw (orchestration engine)
pip install alphaclaw
# Browser Harness (agent browser access)
npx @anthropic-ai/browser-harness@latest install
2. Clone and install
git clone https://github.com/outmarkhq/outclaw.git
cd outclaw
pnpm install
3. Configure and start
cp .env.example .env
# Edit .env with your configuration
pnpm db:push
pnpm build
pnpm start
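For production you will want a process manager rather than a foreground pnpm start. A sketch of a systemd unit that does this; the install path, user, and pnpm binary location are assumptions for a typical /opt deployment, so adjust them for your setup:

```ini
# /etc/systemd/system/outclaw.service (illustrative; adjust paths and user)
[Unit]
Description=Outclaw Command Center
After=network.target mysql.service

[Service]
Type=simple
User=outclaw
WorkingDirectory=/opt/outclaw
EnvironmentFile=/opt/outclaw/.env
ExecStart=/usr/local/bin/pnpm start
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Enable it with systemctl enable --now outclaw, then proceed to the reverse proxy step below.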
4. Set up reverse proxy
Configure Nginx or Caddy to proxy traffic to port 3000 with SSL termination. Example Nginx config:
server {
listen 443 ssl;
server_name command.yourdomain.com;
ssl_certificate /etc/letsencrypt/live/command.yourdomain.com/fullchain.pem;
ssl_certificate_key /etc/letsencrypt/live/command.yourdomain.com/privkey.pem;
location / {
proxy_pass http://127.0.0.1:3000;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
}
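If you prefer Caddy, it provisions and renews TLS certificates automatically, so the equivalent configuration is a short Caddyfile:

```
command.yourdomain.com {
    reverse_proxy 127.0.0.1:3000
}
```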
Environment Variables
Complete reference of all environment variables:
| Variable | Description | Default |
|---|---|---|
| DATABASE_URL | MySQL connection string | — |
| JWT_SECRET | Session cookie signing secret | — |
| APP_URL | Public URL of your instance | http://localhost:3000 |
| LLM_PROVIDER | Default LLM provider | openai |
| LLM_API_KEY | LLM provider API key | — |
| LLM_MODEL | Default model name | Provider default |
| REDIS_URL | Redis connection (optional, for caching) | — |
| SMTP_HOST | SMTP server for email notifications | — |
| SMTP_PORT | SMTP port | 587 |
| SMTP_USER | SMTP username | — |
| SMTP_PASS | SMTP password | — |
| TELEGRAM_BOT_TOKEN | Telegram bot token (optional) | — |
| SLACK_BOT_TOKEN | Slack bot token (optional) | — |
| BROWSER_HARNESS_ENABLED | Enable browser access for agents | true |
Upgrading
Docker
cd outclaw
git pull origin main
docker compose pull
docker compose up -d
docker compose exec app pnpm db:push
Manual
cd outclaw
git pull origin main
pnpm install
pnpm db:push
pnpm build
# Restart your process manager (pm2, systemd, etc.)
Back up your database with mysqldump or your preferred backup tool before pulling new changes.
Cloud — Account Setup
The managed cloud version at command.outmarkhq.com handles infrastructure, updates, and scaling for you. Sign up, create a workspace, connect your LLM keys, and start submitting GACCS briefs.
- Go to command.outmarkhq.com and sign up
- Follow the onboarding wizard to create your workspace
- Connect your LLM provider (OpenAI, Anthropic, Google, or OpenRouter)
- Optionally connect messaging channels (Telegram, Slack, WhatsApp)
- Deploy your 21-agent team
Workspaces
Each workspace is an isolated environment with its own agents, knowledge base, channels, and settings. On the cloud version, each team gets their own workspace. On self-hosted, you have a single workspace.
Team Management
Invite team members via email from Settings → Team. Members can submit GACCS briefs and view task progress. Workspace owners can manage agent configuration, LLM keys, and channels.
Writing GACCS Briefs
GACCS is the structured brief format that Outclaw uses to route work to the right agents. Not all fields are required — but the structure forces clarity upfront.
Goals (G)
What business outcome are you targeting? Be specific about metrics, not deliverables. "Increase inbound demo requests by 25% in Q3" is a goal. "Write a blog post" is a deliverable.
Audience (A)
Who is this for? Reference ICP profiles, buyer personas, job titles, company stage, pain points, and where they consume content.
Creative (C)
What does the output look like? Specify format (blog, one-pager, ad copy), tone (technical, conversational), length, and any brand guidelines.
Channels (C)
Where does the output get distributed? Primary channel (e.g., LinkedIn) and secondary channels (e.g., email newsletter, company blog).
Stakeholders (S)
Who needs to review and approve before it goes live? Include names, roles, and approval criteria.
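Putting the five fields together, a complete brief might read as follows (the contents are illustrative, not a template you must match):

```
Goals:        Increase inbound demo requests by 25% in Q3
Audience:     Heads of Marketing at Series A-B SaaS companies; active on LinkedIn
Creative:     1,200-word technical blog post, conversational tone, follows brand guide
Channels:     Company blog (primary); LinkedIn and email newsletter (secondary)
Stakeholders: VP Marketing (final approval); Product lead (technical accuracy review)
```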
Agent Configuration
Each agent has a system prompt, assigned sub-function, tier (leader/coordinator/specialist), and model assignment. You can customize agent behavior from Settings → Agents in the Command Center.
Agents are provisioned automatically during workspace setup with sensible defaults based on the MKT1 org structure. Customize as needed for your specific use case.
Channels
Outclaw supports multiple input channels for submitting work to your agent team:
- Built-in GACCS form — always available at /cc/new-request
- Telegram — connect a bot to submit briefs via chat
- Slack — submit briefs from any Slack channel
- WhatsApp Business — submit briefs via WhatsApp
- API — programmatic access for custom integrations
LLM Providers
Outclaw supports multiple LLM providers. You bring your own API keys; they are never stored on or proxied through our servers.
| Provider | Models | Best For |
|---|---|---|
| OpenAI | GPT-4o, GPT-4o-mini | General purpose, fast |
| Anthropic | Claude Sonnet 4, Claude Haiku | Long-form content, analysis |
| Google | Gemini 2.5 Pro, Flash | Multimodal, large context |
| OpenRouter | Any model | Model variety, fallbacks |
| Custom | Any OpenAI-compatible | Self-hosted models, Ollama |
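For the Custom row, here is a sketch of pointing Outclaw at a local Ollama instance through its OpenAI-compatible API. Note that LLM_BASE_URL is an assumed variable name (the environment variable reference above does not list a base-URL setting), so verify it against your version's .env.example:

```shell
# Hypothetical custom-provider settings for Ollama's OpenAI-compatible API.
# LLM_BASE_URL is an assumption; check .env.example for the real variable name.
LLM_PROVIDER="custom"
LLM_BASE_URL="http://localhost:11434/v1"
LLM_MODEL="llama3.1"
LLM_API_KEY="ollama"   # Ollama accepts any non-empty key
```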
Enterprise Edition
Outclaw uses a dual-license model. The core is MIT licensed — free to use, modify, and distribute. Enterprise features are available under a proprietary license for teams that need additional capabilities:
- SSO / SAML authentication — connect to your identity provider
- Audit logs — complete trail of all agent actions and user activity
- SLA policies — define response time targets for different brief priorities
- Advanced analytics — cost tracking, agent performance metrics, ROI attribution
- Priority support — direct access to the engineering team
- Custom branding — white-label the Command Center with your brand
Contact [email protected] for enterprise licensing.