Chapter 1: What is Worka?
TL;DR
- Host: Worka is a local-first Rust application that functions as a secure host for sandboxed AI Packs. It provides core services like a UI shell, a database, process management, and a secure communication bridge.
- Packs: All features are delivered as Packs. A pack is a bundle containing UI views, backend logic (MCP servers), and a manifest (`aip.json`) declaring its contents and required permissions.
- Orchestrator: The AI engine is a DAG-based worker that executes workflows. It uses LLMs to generate plans (as graphs of tool calls) and then executes them, enabling robust, stateful automation.
Welcome to the foundational concept of the Worka platform. Before we write any code, it's essential to understand the architecture and the roles of its different components.
Think of the Worka application as an operating system (like Windows or macOS), and AI Packs as the applications you install on it (like Word or Photoshop). The OS provides the core services, security, and the environment to run applications, while the applications themselves provide the actual features you interact with.
Worka is composed of three main conceptual layers:
1. The Host Application
The Host is the core Worka desktop application you install on your computer. It is a native application built with Tauri, which means it has a powerful Rust backend and a fast, modern frontend. The Host is responsible for:
- Providing the UI Shell: It renders the main window, the sidebar, the tab management, and all the core interface elements.
- Security: It acts as a secure sandbox for all AI Packs. A pack cannot access your filesystem or the network unless it explicitly declares that it needs to and you grant it permission.
- Core Services: It provides essential services that all packs can securely access, such as a database for storage and a way to manage processes.
- Local-First: All of your data, pack source code, and credentials are encrypted and stored locally on your machine. Nothing is sent to the cloud unless you explicitly configure a pack to do so.
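The permission model above can be sketched in a few lines of Rust. Everything here is illustrative: the type and method names (`Permission`, `PackSandbox`, `request`) are assumptions for the sake of the example, not Worka's real API. The point is the shape of the check: a pack declares what it needs, the user grants a subset, and the Host refuses anything outside that subset.

```rust
use std::collections::HashSet;

// Hypothetical sketch, not Worka's actual API: a pack's manifest
// declares the permissions it needs, and the Host checks every
// privileged request against the set the user actually granted.
#[derive(Hash, PartialEq, Eq, Clone, Debug)]
enum Permission {
    ReadFile,
    Network,
}

struct PackSandbox {
    granted: HashSet<Permission>,
}

impl PackSandbox {
    // Deny by default: a request succeeds only if the user granted it.
    fn request(&self, needed: &Permission) -> Result<(), String> {
        if self.granted.contains(needed) {
            Ok(())
        } else {
            Err(format!("permission denied: {:?} was not granted", needed))
        }
    }
}

fn main() {
    // The user granted only filesystem read access to this pack.
    let sandbox = PackSandbox {
        granted: [Permission::ReadFile].into_iter().collect(),
    };
    assert!(sandbox.request(&Permission::ReadFile).is_ok());
    assert!(sandbox.request(&Permission::Network).is_err());
    println!("sandbox checks passed");
}
```

The design choice worth noting is the default: the sandbox starts empty, so a pack that declares nothing can do nothing privileged.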
2. The AI Pack Ecosystem
All functionality in Worka is delivered through AI Packs. A pack is a self-contained, installable bundle that can include everything from UI components (written in React) to backend logic, AI agents, and tools. Because packs are sandboxed, they are a safe and modular way to extend the capabilities of Worka.
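The TL;DR mentioned that every pack ships a manifest, `aip.json`, declaring its contents and required permissions. As a rough sketch of what such a manifest might look like (the field names and permission strings here are assumptions for illustration; this chapter does not specify the real schema):

```json
{
  "name": "example-pack",
  "version": "0.1.0",
  "views": ["src/views/Dashboard.tsx"],
  "servers": ["src/server/main.ts"],
  "permissions": ["fs:read", "network"]
}
```

The key idea is that the manifest is the pack's contract with the Host: the UI views and backend servers it bundles, and the permissions it must be granted before they can run.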
3. The Orchestrator
The Orchestrator is the brain of the Worka platform. When you give a task to an AI Agent within a pack, the Orchestrator takes over. It is responsible for:
- Planning: It works with the AI to translate a high-level goal into a concrete, step-by-step plan. This plan is structured as a Directed Acyclic Graph (DAG).
- Execution: It executes the DAG, running each step in the correct order and ensuring that dependencies are met.
- Coordination: It coordinates the use of different AI Agents and the Tools they have access to, ensuring the right component is used for the right job.
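The execution step above can be sketched as a topological walk over the plan graph. This is a minimal illustration of the DAG idea, not Worka's actual orchestrator: the `Step` type and step names are invented for the example, and a real orchestrator would invoke tools (and could run independent steps in parallel) rather than just compute an order.

```rust
use std::collections::{HashMap, VecDeque};

// A hypothetical, simplified plan node: each step names the steps it
// depends on. (Illustrative only; the real plan format is not shown here.)
struct Step {
    id: &'static str,
    deps: Vec<&'static str>,
}

// Kahn's algorithm: a step becomes runnable only once every step it
// depends on has completed.
fn execution_order(steps: &[Step]) -> Vec<&'static str> {
    let mut indegree: HashMap<&'static str, usize> =
        steps.iter().map(|s| (s.id, s.deps.len())).collect();
    // Map each step to its dependents so we can decrement in-degrees.
    let mut dependents: HashMap<&'static str, Vec<&'static str>> = HashMap::new();
    for s in steps {
        for &d in &s.deps {
            dependents.entry(d).or_default().push(s.id);
        }
    }
    // Steps with no dependencies are runnable immediately.
    let mut ready: VecDeque<&'static str> = steps
        .iter()
        .filter(|s| s.deps.is_empty())
        .map(|s| s.id)
        .collect();
    let mut order = Vec::new();
    while let Some(id) = ready.pop_front() {
        order.push(id);
        if let Some(next) = dependents.get(id) {
            for &dep in next {
                let n = indegree.get_mut(dep).unwrap();
                *n -= 1;
                if *n == 0 {
                    ready.push_back(dep);
                }
            }
        }
    }
    order
}

fn main() {
    // A toy plan: fetch two sources, then merge, then report.
    let plan = vec![
        Step { id: "fetch_a", deps: vec![] },
        Step { id: "fetch_b", deps: vec![] },
        Step { id: "merge", deps: vec!["fetch_a", "fetch_b"] },
        Step { id: "report", deps: vec!["merge"] },
    ];
    let order = execution_order(&plan);
    let pos = |id| order.iter().position(|&s| s == id).unwrap();
    // Both fetches complete before the merge; the merge before the report.
    assert!(pos("fetch_a") < pos("merge"));
    assert!(pos("fetch_b") < pos("merge"));
    assert!(pos("merge") < pos("report"));
    println!("execution order: {:?}", order);
}
```

Because the plan is acyclic, independent branches (here, the two fetches) have no ordering constraint between them, which is exactly what lets an orchestrator run them concurrently.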
This model allows Worka to perform complex, multi-step tasks that go far beyond the capabilities of a simple chatbot.