The Cortex Swarm features a robust bidirectional file synchronization engine ("Ghost Mirror") built over the bidirectional gRPC task stream.
The Node (core/sync.py, core/watcher.py) uses watchdog to monitor its local workspace folder. When a user or command modifies a file, the Node chunks it into 64KB FilePayload Protobufs and streams them to the Hub.
The Node also responds to SyncControl gRPC commands:

- Lifecycle: START_WATCHING, STOP_WATCHING, LOCK (blocks local edits).
- File operations: LIST, READ, WRITE, DELETE.

The Hub (core/grpc/, core/mirror.py, assistant.py) receives the FilePayload chunks and maintains an exact, real-time replica of the workspace on the Hub's local disk at /app/data/mirrors/{session_id}. The AssistantService exposes .ls(), .cat(), .write(), and .rm(), which send SyncControl commands over the network to the Node and wait for the result. The mesh_file_explorer skill uses these AssistantService functions, so every time the AI reads a file it performs a full network round-trip: the Node reads the file and sends the contents back.

This file sync infrastructure (Ghost Mirror) is incredibly powerful for the AI. Because the sync engine guarantees eventual consistency across assigned nodes and the central Hub mirror, it natively unlocks several advanced Swarm workflows:
When the AI is tasked with refactoring an entire codebase, it can use the mesh_file_explorer to apply massive multi-file changes entirely on the Hub's local mirrored repository.
Ghost Mirror automatically syncs these changes out to the specific Edge Node. The AI can then invoke the execute skill on that remote edge node (e.g., npm run test); the code is already there. If the run fails, the AI iterates locally on the Hub.

A single synced directory can also act as live shared memory across multiple disjoint agent nodes that are otherwise unaware of each other.
For example, with two nodes attached to the same session_id workspace, one node writes shared/data.json; the Hub's Ghost Mirror immediately replicates that change down to Node B, whose crawler detects the new file and begins processing it.

Though the AI can technically access files using the mesh_file_explorer skill, the current implementation is inefficient. If the AI is debugging a file in a synced workspace (session_id), it wastes time doing network round-trips to the Agent Node for ls and cat commands.
Since the Hub already maintains a perfect local Ghost Mirror (/app/data/mirrors), the AI should be querying the local Hub disk instantly.
Goal: Reduce AI file exploration latency from ~1000ms (network) to ~1ms (local disk).
Implementation Blueprint (For AI reading this document):
Open /app/ai-hub/app/core/grpc/services/assistant.py, where ls and cat are defined. Short-circuit ls and cat to local reads whenever session_id != "__fs_explorer__".
The __fs_explorer__ session is a special session used by the UI to browse arbitrarily anywhere on the machine; it must still use gRPC. Any other session_id means the Hub has a mirror at /app/data/mirrors/{session_id}.

For ls:
- Use os.scandir or os.walk on the local Hub path (/app/data/mirrors/{session_id}/{path}).
- Return {"path": path, "files": [{"name": "file.txt", "size": 123, "is_dir": False}, ...]}.
- Skip self.journal.register(...) and node.queue.put(...) entirely.
For cat:

- Use open(path, 'r').read() on the local Hub mirror.
- Return {"path": path, "content": text}.
For write/rm:

- Apply the change to the local mirror with os and shutil tools.
- Keep the node.queue.put(FileSyncMessage(SyncControl.WRITE)) line, but make it "fire and forget" (or await it concurrently), returning Success to the AI instantly.

Goal: Empower the Swarm AI to autonomously manage replication and locks across nodes.
Implementation Blueprint (For AI reading this document):
Create /app/ai-hub/app/core/skills/definitions/mesh_sync_control.json and map it in /app/ai-hub/app/core/services/tool.py. The skill exposes:

- start_sync(node_id: str, path: str): Sends SyncControl.START_WATCHING via AssistantService to instruct a new edge node to hook into the mesh.
- lock_node(node_id: str): Sends SyncControl.LOCK to prevent a human dev from altering files while the SubAgent is running multi-file edits.
- resync_node(node_id: str): Sends SyncControl.RESYNC to force the node to hash-check itself against the master mirror and repair desync errors.

Goal: Allow the AI to act as the ultimate "git merge" authority over the distributed filesystem.
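The three mesh_sync_control commands above might be wrapped roughly as follows; the SyncControl names come from this document, but the enum values and the AssistantService send signature shown here are assumptions:

```python
from enum import Enum


class SyncControl(Enum):
    """Commands named in this document; numeric values are placeholders."""
    START_WATCHING = 1
    LOCK = 2
    RESYNC = 3


class MeshSyncControl:
    """Skill-side wrappers that relay SyncControl commands to a node."""

    def __init__(self, assistant_service):
        # assistant_service.send(node_id, command, **kwargs) is an assumed API.
        self.svc = assistant_service

    def start_sync(self, node_id: str, path: str) -> None:
        """Hook a new edge node into the mesh for the given path."""
        self.svc.send(node_id, SyncControl.START_WATCHING, path=path)

    def lock_node(self, node_id: str) -> None:
        """Block local edits while the SubAgent runs multi-file changes."""
        self.svc.send(node_id, SyncControl.LOCK)

    def resync_node(self, node_id: str) -> None:
        """Force a hash-check against the master mirror to repair desync."""
        self.svc.send(node_id, SyncControl.RESYNC)
```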
Implementation Blueprint (For AI reading this document):
In /app/ai-hub/app/core/grpc/services/assistant.py (or the main task stream router), intercept SyncStatus.RECONCILE_REQUIRED events and push an Observation event directly into the SubAgent's RagPipeline queue.
The Observation should name the conflicting file (e.g., /src/lib.js). Add an inspect_drift(node_id, file_path) skill which returns a unified diff of what the Hub thinks the file looks like vs. what the Node actually has, empowering the AI to issue the decisive write.
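The diff portion of a hypothetical inspect_drift could be built on difflib; fetching the Hub and Node copies (local mirror read plus gRPC cat) is elided, so the sketch below takes the two texts directly:

```python
import difflib


def inspect_drift(hub_text: str, node_text: str, file_path: str) -> str:
    """Unified diff of the Hub mirror's view vs. the Node's actual file.

    Returns an empty string when the two copies are identical.
    """
    diff = difflib.unified_diff(
        hub_text.splitlines(keepends=True),
        node_text.splitlines(keepends=True),
        fromfile=f"hub:{file_path}",
        tofile=f"node:{file_path}",
    )
    return "".join(diff)
```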