companion v2.0-beta

Your codebase &
meetings, unified.

Companion merges your development environment, meeting transcripts, and desktop tools into a single, intelligent agent. Code faster, with full context.

companion — ~/workspace
$ companion init --with-vexa-voice

Connect your codebase to reality.

Companion isn't just an LLM wrapper. It's a suite of targeted agents that unify your tools, repositories, and conversations.

OpenCode uses context from previous commits, READMEs, and your colleagues' pull requests to write intelligent, robust code.

$ companion ask "Update the Auth Service"
Planning...
Analyzing src/auth.ts and src/routes.ts...
Found recent meeting: "Auth Service Refactor" (2 days ago)
Requirement: Use JSON Web Tokens (JWT) instead of sessions.
✔ Wrote 140 lines. Updated tests. Ready to commit.
Architecture

Built for developers and researchers.
Powered by context.

Companion is an intelligent orchestrator bridging your codebase and team communication.

OpenCode Engine
A fast, terminal-integrated coding agent that understands your repository structure, writes features, and debugs errors.
Vexa Voice Bots
Connect Companion to your Google Meet or Zoom calls. Vexa transcribes in real time and surfaces action items into your workspace.
MCP Server
A flexible Model Context Protocol layer to safely expose internal enterprise tools to your AI coding agents.
Organizational Context
Transcripts, READMEs, and codebase vectors are automatically indexed into a shared, searchable knowledge graph.
Desktop Automation
Automate tedious local workflows. Ask Companion to run test suites, manage Docker containers, or start build scripts.
Local & Private
Run your agent directly on your machine. Companion defaults to on-premise embeddings, keeping code and meetings secure.
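At its core, the shared knowledge graph boils down to: embed every transcript, README, and code chunk, then rank by similarity at query time. A minimal self-contained sketch of that pattern, using a toy bag-of-words vector in place of a real on-premise embedding model (all names and sample documents here are illustrative, not Companion's actual internals):

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words term-frequency vector.
    A real deployment would use a local embedding model instead."""
    return Counter(text.lower().replace("?", "").split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class ContextIndex:
    """One searchable store for transcripts, READMEs, and code chunks."""
    def __init__(self):
        self.docs = []  # list of (source, text, vector)

    def add(self, source: str, text: str) -> None:
        self.docs.append((source, text, embed(text)))

    def search(self, query: str, k: int = 3):
        qv = embed(query)
        ranked = sorted(self.docs, key=lambda d: cosine(qv, d[2]), reverse=True)
        return [(source, text) for source, text, _ in ranked[:k]]

index = ContextIndex()
index.add("meeting:auth-refactor", "switch auth service from sessions to JWT tokens")
index.add("readme", "project setup instructions and build steps")
hits = index.search("how should the auth service handle tokens?")
```

Because the index is just vectors on local disk, the same design works whether the embedding model runs on-premise or in a private cloud.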
Workflow

Agentic workflow,
simplified.

1. Connect Context

Initialize Companion in your repository. It automatically parses your READMEs and issues, and syncs with your Vexa meeting bots.

$ companion init --all
2. Assign Tasks

Assign complex coding tasks or ask architectural questions directly from your terminal or IDE.

$ companion assign 'Fix auth bug'
3. Review & Ship

Companion writes code, runs your local test suite, and prepares the commit. You review the diff and ship.

$ companion review && git push
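Step 3 gates the commit on a green local test run. A minimal sketch of that pattern with plain `subprocess` (the test command here is a stand-in; this is not Companion's actual implementation):

```python
import subprocess
import sys

def run_gate(test_cmd: list[str]) -> bool:
    """Run the local test suite; only clear the commit if it passes."""
    result = subprocess.run(test_cmd, capture_output=True, text=True)
    if result.returncode != 0:
        print("Tests failed; holding the commit.")
        print(result.stdout)
        return False
    print("Tests passed; ready to commit.")
    return True

# Stand-in for a real suite: a command that exits 0.
ok = run_gate([sys.executable, "-c", "print('2 passed')"])
```

Swapping in your real runner (e.g. `["pytest", "-q"]`) is the only change needed.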
Security

Air-gapped by default.

Companion is built for organizations with sensitive intellectual property. Every component can run entirely on your local machine or private cloud—from transcription and embedding generation to AI inference.

Compatible with Ollama, vLLM, and any OpenAI-compatible endpoint. Bring your own models.
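Because the backends speak the OpenAI-compatible API, swapping models is just a matter of changing the base URL. A sketch of the request shape, using Ollama's default local port (the URL and model name are assumptions, not requirements):

```python
import json

def build_chat_request(base_url: str, model: str, prompt: str) -> tuple[str, bytes]:
    """Build an OpenAI-compatible /chat/completions request for a local endpoint."""
    url = f"{base_url}/v1/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return url, body

# Ollama serves an OpenAI-compatible API on port 11434 by default.
url, body = build_chat_request("http://localhost:11434", "llama3", "Fix the auth bug")
```

The same payload works unchanged against vLLM or any other OpenAI-compatible endpoint; only `base_url` and `model` differ.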

Architecture Config

~/.companion/config.yml

storage: local      # Vector DB and transcripts never leave disk
inference: ollama   # 100% local weights and computation
telemetry: false    # No phone home. Zero data sharing.

Start Building

Ready to ship faster?

Install the Companion CLI globally to start indexing your codebase and attending your meetings.

$ curl -fsSL https://companion.ky0.dev/install.bash | bash
Requires macOS or Linux and an active API key.