System Architecture
Overview
Katachi employs a Hybrid Cloud-Local Architecture designed to balance the raw power of your local development environment with the convenience of cloud-based orchestration. Unlike cloud-only IDEs that detach you from your hardware, Katachi brings the intelligence to your machine.
Architecture diagram: Your Machine (Sentinel Agent) ↔ Secure Tunnel (Zero Trust Pipe) ↔ Katachi Cloud (Orchestration Nexus)
Core Components
The system is composed of three primary pillars:
1. The Sentinel Agent (Local)
Running natively on your machine (macOS, Linux, or Windows), the Sentinel Agent is the hands and eyes of the system.
- Direct Execution: Executes commands directly in your shell after they pass strict Sentinel checks.
- File Access: Reads and edits code locally. Your source code never leaves your machine for storage.
- Hardware Binding: Cryptographically bound to your specific device fingerprint, ensuring credentials cannot be stolen and reused elsewhere.
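To make the Hardware Binding idea concrete, here is a minimal sketch of one way a device-bound signature could work. It is illustrative only: the attribute set, the `device_secret`, and the function names are assumptions, and a production agent would rely on hardware-backed key storage (TPM, Secure Enclave) rather than a file-based secret.

```python
import hashlib
import hmac
import platform
import uuid

def device_fingerprint() -> str:
    """Derive a stable fingerprint from machine attributes.
    (Illustrative only: a real agent would prefer hardware-backed
    identifiers over hostname/MAC-derived ones.)"""
    raw = f"{platform.node()}|{platform.machine()}|{uuid.getnode()}"
    return hashlib.sha256(raw.encode()).hexdigest()

def sign_request(payload: bytes, device_secret: bytes) -> str:
    """Bind a request to this device by mixing the fingerprint into the signature."""
    mac = hmac.new(device_secret, device_fingerprint().encode() + payload, hashlib.sha256)
    return mac.hexdigest()

if __name__ == "__main__":
    secret = b"stored-only-on-this-machine"  # hypothetical local secret
    print(sign_request(b'{"cmd": "refactor"}', secret))
```

Because the signature mixes in the fingerprint, credentials copied to another machine produce signatures that no longer match and can be rejected.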
2. The Orchestration Nexus (Cloud)
Our cloud control plane manages the "state" of your session without holding your data.
- Session Management: Coordinates the connection between the web dashboard and your local agent.
- Command Relay: Securely routes user instructions to your agent (see the sketch after this list).
- Identity & Billing: Handles secure authentication and subscription management via industry-standard providers.
3. The Secure Tunnel
Connecting the Local Agent and the Cloud Nexus is a Zero Trust encrypted tunnel.
- No Open Ports: You never need to open a port on your router or configure firewalls. The connection is outbound-only.
- End-to-End Encryption: Traffic flows through a TLS 1.3 encrypted channel (see the connection sketch below).
- Ephemeral: Tunnels are created on-demand and destroyed when the session ends.
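Here is a minimal sketch of the outbound-only connection pattern, assuming the agent dials a relay endpoint over TLS 1.3 using Python's standard library. The host name is a placeholder, and the real tunnel protocol (framing, authentication, reconnection) is omitted.

```python
import socket
import ssl

RELAY_HOST = "nexus.example.com"  # illustrative endpoint, not a real one
RELAY_PORT = 443

def open_tunnel() -> ssl.SSLSocket:
    """Dial out to the relay; nothing ever listens on the local machine."""
    ctx = ssl.create_default_context()            # verifies the relay's certificate
    ctx.minimum_version = ssl.TLSVersion.TLSv1_3  # require TLS 1.3
    sock = socket.create_connection((RELAY_HOST, RELAY_PORT), timeout=10)
    return ctx.wrap_socket(sock, server_hostname=RELAY_HOST)

# The tunnel is ephemeral: opened for a session, closed when it ends.
# with open_tunnel() as tls:
#     tls.sendall(b"HELLO")  # framing/protocol details omitted
```

Because the agent initiates the connection, the relay only ever sees an outbound client; nothing on your network needs to accept inbound connections.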
Data Flow
When you ask Katachi to "Refactor this file":
1. Request: You send a command via the Web Dashboard (e.g., "Refactor this file").
2. Transport: The command travels securely through the encrypted Tunnel to your Local Agent. No open ports required.
3. Processing: The Agent reads the local file context needed for the task using standard OS calls.
4. Reasoning: The specific context spans are sent ephemerally to the LLM. Your full codebase is never uploaded.
5. Action: The LLM returns a precise code diff or shell command to execute.
6. Execution: The Agent applies the changes locally and runs your linter/tests to verify correctness.
7. Feedback: Results (success/failure/logs) are streamed back through the Tunnel to your Dashboard.
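The outline below traces that loop in code. It is a sketch of the shape of the flow only: `call_llm`, `apply_diff`, the span selection, and the use of pytest as the verification step are stand-ins, not Katachi's actual interfaces.

```python
import subprocess
from typing import Callable

def handle_command(command: str,
                   path: str,
                   call_llm: Callable[[str, str], str]) -> dict:
    # Processing: read the local file with ordinary OS calls.
    with open(path, "r", encoding="utf-8") as f:
        source = f.read()

    # Reasoning: a real agent would select only the relevant spans here;
    # this sketch passes the file content as-is for brevity.
    diff = call_llm(command, source)

    # Action + Execution: apply the returned change, then verify locally.
    apply_diff(path, diff)
    tests = subprocess.run(["pytest", "-q"], capture_output=True, text=True)

    # Feedback: stream a structured result back through the tunnel.
    return {"ok": tests.returncode == 0, "log": tests.stdout[-2000:]}

def apply_diff(path: str, diff: str) -> None:
    """Placeholder: a real agent would apply a unified diff atomically."""
    ...
```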
Security by Design
This architecture ensures that Control remains in the cloud, but Execution and Data remain local.
- Attackers compromising the cloud cannot execute arbitrary commands without passing the local Agent's strict Sentinel checks (sketched below).
- Your codebase is never uploaded to Katachi servers; only the snippets needed for the current task are processed ephemerally. You continue using your preferred Git provider (GitHub, GitLab, Bitbucket, etc.).
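As an illustration of the kind of gate the first point refers to, here is a minimal sketch of a local policy check. The allow and deny sets are examples chosen for this sketch, not Katachi's actual policy.

```python
import shlex

# Hypothetical shape of a Sentinel policy gate: even a fully compromised
# control plane can only *request* commands; the local agent decides
# whether to run them.
ALLOWED_BINARIES = {"git", "npm", "pytest", "ruff"}
DENIED_TOKENS = {"rm", "sudo", "curl", ">", "|"}

def sentinel_check(command: str) -> bool:
    """Return True only if the command passes the local policy."""
    tokens = shlex.split(command)
    if not tokens or tokens[0] not in ALLOWED_BINARIES:
        return False
    return not any(tok in DENIED_TOKENS for tok in tokens)

def execute(command: str) -> None:
    if not sentinel_check(command):
        raise PermissionError(f"blocked by Sentinel policy: {command!r}")
    # subprocess.run(shlex.split(command), check=True)  # run only after the check passes
```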