diff --git a/README.md b/README.md
index cd9ed64..6adb19a 100644
--- a/README.md
+++ b/README.md
@@ -1,136 +1,94 @@
-# 🧠 Cortex Hub: AI Model Hub Service
+# 🧠 Cortex Hub: Autonomous AI Agent Mesh & Orchestrator
 
-[](https://www.python.org/downloads/release/python-3110/)
-[](https://github.com/psf/black)
-[](https://fastapi.tiangolo.com/)
+Cortex Hub is a state-of-the-art, modular AI orchestration platform that bridges the gap between Large Language Models (LLMs) and local execution via a distributed **Agent Node Mesh**. It features a modern React-based workspace, a powerful **Skill System**, and advanced **RAG** (Retrieval-Augmented Generation) capabilities.
 
-Cortex Hub is a modular and scalable API service designed to act as a central gateway to various Large Language Models (LLMs). It features a stateful, session-based chat system with conversational memory, powered by a Retrieval-Augmented Generation (RAG) pipeline for grounding responses in your own data.
+## ✨ Key Features
 
-## ✨ Features
+* **🌐 Distributed Agent Mesh**: Connect multiple local or remote nodes (Linux, macOS, Windows) to your Hub. Each node can execute tasks, manage files, and provide terminal access.
+* **🛠️ Extensible Skill System**: Orchestrate AI capabilities via "Skills" (Browser Automation, Terminal Control, File Management). Dynamic permissioning allows granular control over which users or groups can access specific nodes and skills.
+* **📂 Private RAG Pipeline**: Securely ingest documents into a FAISS vector store to ground AI responses in factual, local data.
+* **🔐 Industrial-Grade Security**: Integrated with OIDC (OpenID Connect) for secure user authentication and Role-Based Access Control (RBAC).
+* **🖥️ Unified Command Center**: A sleek, premium React frontend for managing sessions, configuring nodes, and monitoring the swarm in real-time.
 
-* **Conversational Memory**: Engages in stateful conversations using a session-based API. The AI remembers previous messages to provide contextual follow-up answers.
-* **Retrieval-Augmented Generation (RAG)**: Ingest documents into a high-speed FAISS vector store to ground the AI's responses in factual, user-provided data.
-* **Multi-Provider Support**: Easily integrates with multiple LLM providers (currently DeepSeek and Gemini).
-* **Full Document Lifecycle**: Complete API for adding, listing, and deleting documents from the knowledge base.
-* **Modern Tech Stack**: Built with FastAPI, Pydantic, SQLAlchemy, and DSPy for a robust, type-safe, and high-performance backend.
-* **Containerized**: Fully containerized with Docker and Docker Compose for easy setup and deployment.
+---
 
------
+## 💬 How to Use
 
-## 🚀 Getting Started
+### 1. Connecting Nodes
+Download the **Agent Node Bundle** from the "Nodes" page in the dashboard. Unzip and run `./run.sh` (Linux/macOS) or `run.bat` (Windows). The node will automatically connect to your Hub using the pre-configured security token.
 
-You can run the entire application stack (API server and database) using Docker Compose.
+### 2. Using Skills
+Skills like **Browser Automation** and **Terminal Control** are available directly in your chat sessions. You can attach specific nodes to a session to give the AI hands-on access to those environments.
+
+---
+
+## 🚀 Quick Start (Local Development)
+
+The easiest way to get started is using the pre-configured **Dev Container** or local Docker Compose.
 
 ### Prerequisites
+* **Docker** & **Docker Compose**
+* **Python 3.11+** (if running without Docker)
+* **Node.js 18+** (for frontend development)
 
-* **Docker** and **Docker Compose**
-* **Python 3.11+** (for local development)
-* An API key for at least one supported LLM (DeepSeek, Gemini).
-
-### 1\. Configuration
-
-The application is configured using a `.env` file for secrets and a `config.yaml` for non-sensitive settings.
-
-First, copy the example environment file:
-
+### 1. Configure Secrets
+Copy the example environment file and add your API keys (Gemini, OpenAI, etc.):
 ```bash
 cp .env.example .env
 ```
-Now, open the `.env` file and add your secret API keys:
-
-```
-# .env
-DEEPSEEK_API_KEY="your_deepseek_api_key_here"
-GEMINI_API_KEY="your_gemini_api_key_here"
-```
-
-*(You only need to provide a key for the model you intend to use.)*
-
-### 2\. Running with Docker Compose (Recommended)
-
-This is the simplest way to get the service running.
-
+### 2. Launch the Stack
+Initialize the Hub (Backend, Frontend, and Database) in one command:
 ```bash
-docker-compose up --build
+docker compose up -d --build
 ```
+* **Frontend**: [http://localhost:8002](http://localhost:8002)
+* **API Docs**: [http://localhost:8000/docs](http://localhost:8000/docs)
 
-The API server will be available at `http://127.0.0.1:8000`.
+---
 
-### 3\. Running Locally (Alternative)
+## 🏗️ Deployment Architecture
 
-If you prefer to run without Docker:
+Cortex Hub uses a layered deployment strategy to keep the core codebase clean while supporting specific production environments.
+
+### 📂 Folder Structure
+* **`ai-hub/`**: The Core Python (FastAPI) backend.
+* **`ui/client-app/`**: The React-based unified dashboard.
+* **`agent-node/`**: The lightweight client software for distributed nodes.
+* **`skills/`**: Source for AI capabilities (Shell, Browser, etc.).
+* **`deployment/`**: Environment-specific overrides (e.g., `jerxie-prod` with NFS support).
+* **`scripts/`**: Centralized automation for sync, rebuilding, and maintenance.
+
+### 🚢 Production Deployment
+For Jerxie AI production instances, we use the centralized remote deployer:
 ```bash
-# Install dependencies
-pip install -r requirements.txt
-
-# Run the server
-uvicorn app.main:app --host 127.0.0.1 --port 8000 --reload
+# Sync local changes and rebuild on the production server
+REMOTE_PASS='' bash scripts/remote_deploy.sh
 ```
 
------
+---
 
-## 💬 Usage
+## 🏛️ Project Layout
 
-### Interactive Chat Script
-
-The easiest way to interact with the service is by using the provided chat script. It handles starting the server, creating a session, and managing the conversation.
-
-In your terminal, simply run:
-
-```bash
-bash run_chat.sh
-```
-
-You will be prompted to enter your questions in a continuous loop. Type `exit` to end the session and shut down the server.
-
-### API Documentation
-
-Once the server is running, interactive API documentation (powered by Swagger UI) is automatically available at:
-
-* **[http://127.0.0.1:8000/docs](https://www.google.com/search?q=http://127.0.0.1:8000/docs)**
-
-## From this page, you can explore and execute all API endpoints directly from your browser
-
-## 🧪 Running Tests
-
-The project includes a comprehensive test suite using `pytest`.
-
-### Unit Tests
-
-These tests cover individual components in isolation and use mocks for external services and the database.
-
-```bash
-pytest tests/
-```
-
-### Integration Tests
-
-These tests run against a live instance of the server to verify the end-to-end functionality of the API. The script handles starting and stopping the server for you.
-
-```bash
-bash run_integration_tests.sh
-```
-
------
-
-## 🏛️ Project Structure
-
-The project follows a standard, scalable structure for modern Python applications.
-
-```
+```text
 .
-├── app/                # Main application package
-│   ├── api/            # API layer: routes, schemas, dependencies
-│   ├── core/           # Core business logic: services, pipelines, providers
-│   ├── db/             # Database layer: models, session management
-│   ├── app.py          # FastAPI application factory
-│   └── main.py         # Application entry point
-├── config.yaml         # Default configuration
-├── data/               # Persistent data (SQLite DB, FAISS index)
-├── integration_tests/  # End-to-end tests
-├── tests/              # Unit tests
-├── Dockerfile
-└── docker-compose.yml
+├── ai-hub/             # Backend API & Orchestrator
+├── ui/                 # Frontend Workspace (React)
+├── agent-node/         # Distributed Node Client
+├── skills/             # AI Skill Definitions
+├── deployment/         # Env Overrides (NFS, SSL, OIDC)
+├── scripts/            # CI/CD & Maintenance Scripts
+├── cortex.db           # Local SQLite Cache
+└── docker-compose.yml  # Generic Development Entrypoint
 ```
+
+## 🧪 Testing
+
+* **Backend**: `pytest ai-hub/tests/`
+* **Frontend Health**: `scripts/frontend_tester`
+* **Connectivity**: `scripts/test_ws.js`
+
+---
+
+## ⚖️ License
+Distributed under the MIT License. See `LICENSE` for more information.
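
The "Configure Secrets" step that the rewritten Quick Start introduces can be sketched as a standalone shell snippet. This is a sketch under assumptions, not part of the patch: the key names are illustrative (the README mentions Gemini and OpenAI as examples), and a real checkout already ships its own `.env.example`, so the first block below only stands in for it.

```shell
# Stand-in for the repo's own .env.example (a real clone ships one already).
# GEMINI_API_KEY / OPENAI_API_KEY are illustrative names, not confirmed ones.
cat > .env.example <<'EOF'
GEMINI_API_KEY=""
OPENAI_API_KEY=""
EOF

# The documented step: copy the template, then fill in your keys by hand.
cp .env.example .env

# Optional sanity check before `docker compose up -d --build`:
# fail fast if an expected key line is missing entirely from .env.
for key in GEMINI_API_KEY OPENAI_API_KEY; do
  grep -q "^${key}=" .env || { echo "missing ${key}" >&2; exit 1; }
done
echo "env ready"
```

A check like this turns a forgotten `.env` into an immediate, readable failure instead of an opaque provider error after the stack is already running.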