cortex-hub / docs / refactors / skill_symlink_plan.md

Phase 4: Native Skill Symlinking

Observation

Currently, the LLM reads the contents of bash scripts from SKILL.md entirely into context and executes them remotely using generic commands. This breaks down when a skill involves a complex directory of files (e.g. 1,000 lines of Python configuration, custom shell scripts). Since Cortex runs a bi-directional gRPC file synchronizer that maps a server-side Session directory (/tmp/cortex-sync/{session_id}/) directly down to the Client Node Workers, we can dynamically expose tools on a Node's active filesystem by symlinking the Skill's folder into that session directory.

Objective

When a session is active, dynamically mount any active File-System Skills (e.g. weather_api) straight into the session workspace directory (.skills/). Because of the background mesh_file_explorer file-syncing loop, any symlinked .skills folder on the Server is automatically resolved and replicated down to the running Node Worker. This lets the AI execute bash .skills/weather_api/run.sh natively without loading any code into its context.

Step-by-Step Implementation Plan

1. Identify Workspace Initialization Hook

The AI Server initializes the file sync workspace folder whenever a session starts or connects:

  • Files involved: app/ai-hub/app/core/services/session.py or wherever session folders are mapped (/tmp/cortex-sync/{session_id}).
  • Goal: During startup of the worker container node, or when the Agent loop starts, we must run a symlink sync process.

2. Map the Linked Folders

  • The central DATA_DIR/skills/ directory holds all physical skills.
  • The Session Workspace directory is located at /tmp/cortex-sync/{session_id}/.
  • Inside the workspace, create a hidden orchestrator directory .skills.
  • Loop through all active tools loaded by ToolService.get_available_tools(...) for the given User/Session.
  • For every active tool found on the file system, create a relative symlink at /tmp/cortex-sync/{session_id}/.skills/{skill_id} pointing back to DATA_DIR/skills/{feature}/{skill_id}.

3. Block Upstream Sync of .skills/ (Security-Critical)

  • The bi-directional sync must NOT push changes from .skills/ back up to the original physical folder if the AI corrupts them. This is critical for security.
  • A symlink created with Python's os.symlink still points at the original folder, so a local modification through the link would modify the original tool.
  • Alternatives: hard-copy the scripts, or use read-only Docker volume mounts on the nodes. The latter is impossible because the nodes are remote distributed workers. For remote nodes, the file-sync daemon (which walks the tree with Python's os.walk) follows the symlinks and syncs the physical files down to the remote Node.
  • The remote Node therefore treats them as raw downloaded files: it modifies its localized copies, not the Server's source.
  • However, if the Node uploads the changed files back, the Server's file_sync daemon writes the changes through the symlink, modifying the global tool.
  • Mitigation: add .skills/ to the ignored_paths constant in app/core/grpc/shared_core/ignore.py ON THE UPSTREAM route (Node -> Server) so changes are never persisted backward.
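
A sketch of the upstream guard: the constant name ignored_paths comes from the plan, but the predicate and its matching logic are assumptions about how ignore.py is shaped:

```python
# Hypothetical shape of app/core/grpc/shared_core/ignore.py.
# Only the Node -> Server (upstream) route consults this list, so the
# downstream route still delivers .skills/ content to the Node.
ignored_paths = (".skills/",)

def is_upstream_ignored(rel_path):
    """True if a file change reported by a Node must be dropped
    before it reaches the Server's workspace (and its symlinks)."""
    rel_path = rel_path.replace("\\", "/").lstrip("/")
    return any(
        rel_path == prefix.rstrip("/") or rel_path.startswith(prefix)
        for prefix in ignored_paths
    )
```

The key design point is asymmetry: the filter is applied only on upload, never on download.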

4. Inject .skills/ Execution Logic into the System Prompt

  • For each injected skill, the Tool System currently parses the Summary: line or the | Parameters | table from SKILL.md.
  • Modify tool.py so that instead of saying "Call read_skill_artifact to see instructions", the system prompt explicitly tells the AI:
    This skill is natively mapped to your workspace. 
    You can execute it directly on the node via: `bash .skills/{skill_id}/{executable}`.
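
In tool.py this could be a small template helper; the function name and the run.sh default are illustrative, not existing code:

```python
def render_native_skill_prompt(skill_id, executable="run.sh"):
    """Build the system-prompt fragment for a natively mapped skill."""
    return (
        "This skill is natively mapped to your workspace.\n"
        f"You can execute it directly on the node via: "
        f"`bash .skills/{skill_id}/{executable}`."
    )
```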

5. AI Self-Improvement/Evolution Capabilities

Since the .skills/ directory is bi-directionally synced between the true file system and the Node workspace:

  1. The AI can natively use mesh_file_explorer to read any script located inside .skills/.
  2. The AI can use terminal sed, Python's ast module, or file tools to modify and debug its own skill source code when it fails.
  3. Because the directory is physically synced, writes propagate through the symlink into the permanent /app/data/skills/ folder, letting the AI hot-fix its own execution scripts permanently for all future sessions. Note that this conflicts with the upstream ignore rule from step 3: permanent self-improvement would require an explicit, trusted write path rather than blanket upstream sync of .skills/.

6. Finalizing Skill Definitions

  • Users can create a run.sh or main.py directly alongside SKILL.md in the VS Code IDE UI.
  • The LLM is instructed to call that file directly via mesh_terminal_control.
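
An illustrative skill folder under this convention (only SKILL.md is required by the Tool System; the other names are user-chosen):

```
DATA_DIR/skills/{feature}/weather_api/
├── SKILL.md     # summary and parameter table parsed by the Tool System
├── run.sh       # entry point the AI invokes via mesh_terminal_control
└── helpers/     # optional supporting scripts, synced alongside
```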

Summary

By symlinking DATA_DIR/skills/ into /tmp/cortex-sync/{session_id}/.skills/, the gRPC network syncs the scripts as raw files directly into the Client Node's OS container. The Agent incurs zero context bloat while executing large tools, and gains the ability to view, trace, and self-improve its own capabilities across sessions.