cortex-hub / ai-hub / uvicorn.log
nohup: ignoring input
WARNING:app.app:Failed to initialize TTS/STT: 'NoneType' object has no attribute 'split'
INFO:app.core.tools.registry:Registered dynamic tool plugin: 'browser_automation_agent'
INFO:app.core.tools.registry:Registered dynamic tool plugin: 'mesh_file_explorer'
INFO:app.core.tools.registry:Registered dynamic tool plugin: 'mesh_inspect_drift'
INFO:app.core.tools.registry:Registered dynamic tool plugin: 'mesh_sync_control'
INFO:app.core.tools.registry:Registered dynamic tool plugin: 'mesh_terminal_control'
INFO:app.core.tools.registry:Registered dynamic tool plugin: 'mesh_wait_tasks'
INFO:app.core.tools.registry:Registered dynamic tool plugin: 'read_skill_artifact'
INFO:     Started server process [14535]
INFO:     Waiting for application startup.
INFO:app.db.migrate:Starting database migrations...
INFO:app.db.migrate:Column 'audio_path' already exists in 'messages'.
INFO:app.db.migrate:Column 'model_response_time' already exists in 'messages'.
INFO:app.db.migrate:Column 'token_count' already exists in 'messages'.
INFO:app.db.migrate:Column 'reasoning_content' already exists in 'messages'.
INFO:app.db.migrate:Column 'stt_provider_name' already exists in 'sessions'.
INFO:app.db.migrate:Column 'tts_provider_name' already exists in 'sessions'.
INFO:app.db.migrate:Column 'sync_workspace_id' already exists in 'sessions'.
INFO:app.db.migrate:Column 'attached_node_ids' already exists in 'sessions'.
INFO:app.db.migrate:Column 'node_sync_status' already exists in 'sessions'.
INFO:app.db.migrate:Column 'sync_config' already exists in 'sessions'.
INFO:app.db.migrate:Column 'is_cancelled' already exists in 'sessions'.
INFO:app.db.migrate:Column 'restrict_skills' already exists in 'sessions'.
INFO:app.db.migrate:Column 'allowed_skill_names' already exists in 'sessions'.
INFO:app.db.migrate:Column 'system_prompt_override' already exists in 'sessions'.
INFO:app.db.migrate:Column 'is_locked' already exists in 'sessions'.
INFO:app.db.migrate:Database migrations complete.
INFO:app.core.services.node_registry:[NodeRegistry] Reset all DB node statuses to 'offline'.
INFO:app.core.grpc.services.grpc_server:🚀 CORTEX gRPC Orchestrator starting on [::]:50051
INFO:app.app:[M6] Agent Orchestrator gRPC server started on port 50051.
INFO:app.core.orchestration.scheduler:[Scheduler] Agent background services (Zombie Sweeper & CRON) started.
INFO:app.core.skills.bootstrap:Checking for system skills bootstrapping...
INFO:app.core.skills.bootstrap:System skills bootstrap completed.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:8002 (Press CTRL+C to quit)
✅ Loading configuration from app/config.yaml
Application startup...
--- ⚙️  Application Configuration ---
  - ACTIVE_LLM_PROVIDER: gemini
  - ALLOW_OIDC_LOGIN: True
  - ALLOW_PASSWORD_LOGIN: ***
  - DATABASE_URL: sqlite:///./test.db
  - DATA_DIR: /app/data
  - DB_MODE: sqlite
  - DEEPSEEK_API_KEY: sk-a...6bf2
  - DEEPSEEK_MODEL_NAME: deepseek-chat
  - EMBEDDING_API_KEY: AIza...sKuI
  - EMBEDDING_DIMENSION: 768
  - EMBEDDING_MODEL_NAME: models/text-embedding-004
  - EMBEDDING_PROVIDER: google_gemini
  - FAISS_INDEX_PATH: data/faiss_index.bin
  - GEMINI_API_KEY: AIza...sKuI
  - GEMINI_MODEL_NAME: gemini/gemini-3-flash-preview
  - GRPC_CERT_PATH: None
  - GRPC_EXTERNAL_ENDPOINT: None
  - GRPC_KEY_PATH: Not Set
  - GRPC_TARGET_ORIGIN: None
  - GRPC_TLS_ENABLED: False
  - LLM_PROVIDERS: {'gemini': {'api_key': 'AIza...sKuI', 'model': 'gemini/gemini-3-flash-preview'}, 'deepseek': {'api_key': 'sk-a...6bf2'}, 'openai': {'api_key': 'sk-p...h34A'}}
  - LOG_LEVEL: DEBUG
  - OIDC_CLIENT_ID: cortex-server
  - OIDC_CLIENT_SECRET: aYc2...leZI
  - OIDC_ENABLED: False
  - OIDC_REDIRECT_URI: http://localhost:8001/users/login/callback
  - OIDC_SERVER_URL: https://auth.jerxie.com
  - OPENAI_API_KEY: sk-p...h34A
  - PROJECT_NAME: Cortex Hub
  - SECRET_KEY: aYc2...leZI
  - SKILLS_DIR: /app/data/skills
  - STT_API_KEY: AIza...sKuI
  - STT_MODEL_NAME: None
  - STT_PROVIDER: google_gemini
  - STT_PROVIDERS: {}
  - SUPER_ADMINS: ['axieyangb@gmail.com']
  - TTS_API_KEY: AIza...sKuI
  - TTS_MODEL_NAME: None
  - TTS_PROVIDER: google_gemini
  - TTS_PROVIDERS: {}
  - TTS_VOICE_NAME: Kore
  - VERSION: 1.0.0
------------------------------------
Creating database tables...
INFO:     127.0.0.1:54482 - "POST /api/v1/users/login/local HTTP/1.1" 200 OK
INFO:app.core.services.preference:Saving updated global preferences via admin 1df35bf4-2eec-414a-982d-280a6bd73be4
๐Ÿ  Configuration synchronized to app/config.yaml
INFO:     127.0.0.1:54482 - "PUT /api/v1/users/me/config HTTP/1.1" 200 OK
INFO:     127.0.0.1:54482 - "POST /api/v1/users/admin/groups HTTP/1.1" 409 Conflict
INFO:     127.0.0.1:54482 - "GET /api/v1/users/admin/groups HTTP/1.1" 200 OK
INFO:     127.0.0.1:54482 - "PUT /api/v1/users/admin/groups/cf696462-7c31-47ec-99b0-742592a53d60 HTTP/1.1" 200 OK
INFO:     127.0.0.1:54482 - "PUT /api/v1/users/admin/users/1df35bf4-2eec-414a-982d-280a6bd73be4/group HTTP/1.1" 200 OK
INFO:     127.0.0.1:54482 - "POST /api/v1/nodes/admin?admin_id=1df35bf4-2eec-414a-982d-280a6bd73be4 HTTP/1.1" 409 Conflict
INFO:app.core.services.node_registry:[📋] NodeRegistry: Deregistered test-node-1
INFO:     127.0.0.1:54482 - "DELETE /api/v1/nodes/admin/test-node-1?admin_id=1df35bf4-2eec-414a-982d-280a6bd73be4 HTTP/1.1" 200 OK
INFO:app.api.routes.nodes:[admin] Created node 'test-node-1' by admin 1df35bf4-2eec-414a-982d-280a6bd73be4
[NodeRegistry] DB mark-offline failed for test-node-1: UPDATE statement on table 'agent_nodes' expected to update 1 row(s); 0 were matched.
INFO:     127.0.0.1:54482 - "POST /api/v1/nodes/admin?admin_id=1df35bf4-2eec-414a-982d-280a6bd73be4 HTTP/1.1" 200 OK
INFO:     127.0.0.1:54482 - "POST /api/v1/nodes/admin?admin_id=1df35bf4-2eec-414a-982d-280a6bd73be4 HTTP/1.1" 409 Conflict
INFO:app.core.services.node_registry:[📋] NodeRegistry: Deregistered test-node-2
INFO:     127.0.0.1:54482 - "DELETE /api/v1/nodes/admin/test-node-2?admin_id=1df35bf4-2eec-414a-982d-280a6bd73be4 HTTP/1.1" 200 OK
INFO:app.api.routes.nodes:[admin] Created node 'test-node-2' by admin 1df35bf4-2eec-414a-982d-280a6bd73be4
[NodeRegistry] DB mark-offline failed for test-node-2: UPDATE statement on table 'agent_nodes' expected to update 1 row(s); 0 were matched.
INFO:     127.0.0.1:54482 - "POST /api/v1/nodes/admin?admin_id=1df35bf4-2eec-414a-982d-280a6bd73be4 HTTP/1.1" 200 OK
INFO:     127.0.0.1:54482 - "POST /api/v1/users/admin/groups HTTP/1.1" 409 Conflict
INFO:app.core.grpc.services.grpc_server:[gRPC] Incoming RPC Call: /agent.AgentOrchestrator/SyncConfiguration
INFO:app.core.grpc.services.grpc_server:[🔑] SyncConfiguration REQUEST from test-node-2 (token prefix: DyWg...)
INFO:app.core.grpc.services.grpc_server:[gRPC] Incoming RPC Call: /agent.AgentOrchestrator/SyncConfiguration
INFO:app.core.grpc.services.grpc_server:[🔑] SyncConfiguration REQUEST from test-node-1 (token prefix: odRS...)
INFO:app.core.grpc.services.grpc_server:[🔑] Token validated for test-node-2 (owner: 1df35bf4-2eec-414a-982d-280a6bd73be4)
INFO:app.core.grpc.services.grpc_server:[🔑] Handshake successful for test-node-2 (owner: 1df35bf4-2eec-414a-982d-280a6bd73be4)
INFO:app.core.grpc.services.grpc_server:[🔑] Token validated for test-node-1 (owner: 1df35bf4-2eec-414a-982d-280a6bd73be4)
INFO:app.core.services.node_registry:[📋] NodeRegistry: Registered test-node-2 (owner: 1df35bf4-2eec-414a-982d-280a6bd73be4) | Stats enabled
INFO:app.core.grpc.services.grpc_server:[🔑] Handshake successful for test-node-1 (owner: 1df35bf4-2eec-414a-982d-280a6bd73be4)
INFO:app.core.grpc.services.grpc_server:[gRPC] Incoming RPC Call: /agent.AgentOrchestrator/ReportHealth
INFO:app.core.services.node_registry:[📋] NodeRegistry: Registered test-node-1 (owner: 1df35bf4-2eec-414a-982d-280a6bd73be4) | Stats enabled
INFO:app.core.grpc.services.grpc_server:[gRPC] Incoming RPC Call: /agent.AgentOrchestrator/TaskStream
INFO:app.core.grpc.services.grpc_server:[gRPC] Incoming RPC Call: /agent.AgentOrchestrator/ReportHealth
INFO:app.core.grpc.services.grpc_server:[*] Node test-node-2 Attempting to establish TaskStream...
INFO:app.core.grpc.services.grpc_server:[*] Node test-node-2 Online (TaskStream established)
INFO:app.core.grpc.services.grpc_server:[gRPC] Incoming RPC Call: /agent.AgentOrchestrator/TaskStream
INFO:app.core.grpc.services.grpc_server:[*] Node test-node-1 Attempting to establish TaskStream...
INFO:app.core.grpc.services.grpc_server:[*] Node test-node-1 Online (TaskStream established)
INFO:app.api.routes.nodes:[admin] Created node 'test-coworker-sc1-5d3649d0' by admin 1df35bf4-2eec-414a-982d-280a6bd73be4
    [๐Ÿ“๐Ÿ”„] Triggering Resync Check for test-node-2...
    [๐Ÿ“๐Ÿ”„] Triggering Resync Check for test-node-1...
INFO:     127.0.0.1:57442 - "POST /api/v1/nodes/admin?admin_id=1df35bf4-2eec-414a-982d-280a6bd73be4 HTTP/1.1" 200 OK
INFO:app.core.grpc.services.assistant:[๐Ÿ“๐Ÿ“ค] Workspace agent_9b41bbca prepared on server for offline node test-coworker-sc1-5d3649d0
INFO:     127.0.0.1:57442 - "POST /api/v1/agents/deploy HTTP/1.1" 200 OK
INFO:     127.0.0.1:57442 - "GET /api/v1/agents/e27756bf-ef70-4fb3-8b33-b298ebfc0dd1 HTTP/1.1" 405 Method Not Allowed
    [๐Ÿ“๐Ÿงน] Running Mirror Cleanup. Active Sessions: 1
    [๐Ÿ“๐Ÿงน] Running Mirror Cleanup. Active Sessions: 1
INFO:     127.0.0.1:57442 - "GET /api/v1/agents/e27756bf-ef70-4fb3-8b33-b298ebfc0dd1 HTTP/1.1" 405 Method Not Allowed
[last message repeated 24 times]
21:39:32 - LiteLLM:INFO: utils.py:3895 - 
LiteLLM completion() model= deepseek-chat; provider = deepseek
INFO:LiteLLM:
LiteLLM completion() model= deepseek-chat; provider = deepseek
INFO:     127.0.0.1:57442 - "GET /api/v1/agents/e27756bf-ef70-4fb3-8b33-b298ebfc0dd1 HTTP/1.1" 405 Method Not Allowed
INFO:app.app:[Health Check] System LLM statuses updated.
INFO:     127.0.0.1:57442 - "GET /api/v1/agents/e27756bf-ef70-4fb3-8b33-b298ebfc0dd1 HTTP/1.1" 405 Method Not Allowed
INFO:     127.0.0.1:57442 - "GET /api/v1/agents/e27756bf-ef70-4fb3-8b33-b298ebfc0dd1 HTTP/1.1" 405 Method Not Allowed
INFO:     127.0.0.1:57442 - "GET /api/v1/agents/e27756bf-ef70-4fb3-8b33-b298ebfc0dd1 HTTP/1.1" 405 Method Not Allowed
INFO:     127.0.0.1:57442 - "DELETE /api/v1/agents/e27756bf-ef70-4fb3-8b33-b298ebfc0dd1 HTTP/1.1" 200 OK
INFO:app.core.services.node_registry:[📋] NodeRegistry: Deregistered test-coworker-sc1-5d3649d0
INFO:     127.0.0.1:57442 - "DELETE /api/v1/nodes/admin/test-coworker-sc1-5d3649d0?admin_id=1df35bf4-2eec-414a-982d-280a6bd73be4 HTTP/1.1" 200 OK
INFO:app.api.routes.nodes:[admin] Created node 'test-coworker-sc3-8f159ab6' by admin 1df35bf4-2eec-414a-982d-280a6bd73be4
[NodeRegistry] DB mark-offline failed for test-coworker-sc1-5d3649d0: UPDATE statement on table 'agent_nodes' expected to update 1 row(s); 0 were matched.
INFO:     127.0.0.1:57424 - "POST /api/v1/nodes/admin?admin_id=1df35bf4-2eec-414a-982d-280a6bd73be4 HTTP/1.1" 200 OK
INFO:app.core.grpc.services.assistant:[๐Ÿ“๐Ÿ“ค] Workspace agent_2bbf2b00 prepared on server for offline node test-coworker-sc3-8f159ab6
INFO:     127.0.0.1:57424 - "POST /api/v1/agents/deploy HTTP/1.1" 200 OK
INFO:     127.0.0.1:57424 - "GET /api/v1/agents/b71161bc-8cb1-4d80-ad30-e263a6c43eb5/triggers HTTP/1.1" 200 OK
INFO:     127.0.0.1:57424 - "POST /api/v1/agents/b71161bc-8cb1-4d80-ad30-e263a6c43eb5/webhook?token=01172c5c6d338bdf4f48e5815227568e HTTP/1.1" 202 Accepted
WARNING:app.core.services.tool:Dynamic tool schema truncation failed to query model size: This model isn't mapped yet. model=gemini, custom_llm_provider=None. Add it here - https://github.com/BerriAI/litellm/blob/main/model_prices_and_context_window.json.
INFO:app.core.services.rag:[RAG] Mesh Context gathered. Length: 243 chars.
INFO:app.core.services.rag:[RAG] Mesh Context excerpt: Attached Agent Nodes (Infrastructure):
- Node ID: test-coworker-sc3-8f159ab6
  Name: Co-Worker SC-3 Node
  Description: No description provided.
  Status: offline
  Terminal Sandbox Mode: PERMISSIVE
 ...
WARNING:app.core.services.prompt:Prompt with slug 'rag-pipeline' not found.
INFO:root:[Architect] Starting autonomous loop (Turn 1). Prompt Size: 67 chars across 2 messages.
INFO:root:[Architect] Turn 1: Calling LLM (Messages: 2)
21:39:41 - LiteLLM:INFO: utils.py:3895 - 
LiteLLM completion() model= gemini-3-flash-preview; provider = gemini
INFO:LiteLLM:
LiteLLM completion() model= gemini-3-flash-preview; provider = gemini
[AgentExecutor] Task 4.2: Idempotency check for b71161bc-8cb1-4d80-ad30-e263a6c43eb5 in /tmp/cortex/agent_2bbf2b00/
[AgentExecutor] Starting run for b71161bc-8cb1-4d80-ad30-e263a6c43eb5 with provider 'gemini'. Prompt length: 3
INFO:     127.0.0.1:57424 - "GET /api/v1/agents HTTP/1.1" 200 OK
[last message repeated 57 times]
WARNING:app.core.grpc.services.grpc_server:Results listener closed for test-node-2: 
WARNING:app.core.grpc.services.grpc_server:Results listener closed for test-node-1: 
INFO:     127.0.0.1:57424 - "DELETE /api/v1/agents/b71161bc-8cb1-4d80-ad30-e263a6c43eb5 HTTP/1.1" 200 OK
INFO:app.core.services.node_registry:[📋] NodeRegistry: Deregistered test-coworker-sc3-8f159ab6
INFO:     127.0.0.1:57424 - "DELETE /api/v1/nodes/admin/test-coworker-sc3-8f159ab6?admin_id=1df35bf4-2eec-414a-982d-280a6bd73be4 HTTP/1.1" 200 OK
WARNING:app.core.grpc.services.grpc_server:[📶] gRPC Stream TERMINATED for test-node-1. Cleaning up.
INFO:app.core.services.node_registry:[📋] NodeRegistry: Deregistered test-node-1
WARNING:app.core.grpc.services.grpc_server:[📶] gRPC Stream TERMINATED for test-node-2. Cleaning up.
INFO:app.core.services.node_registry:[📋] NodeRegistry: Deregistered test-node-2
21:44:34 - LiteLLM:INFO: utils.py:3895 - 
LiteLLM completion() model= deepseek-chat; provider = deepseek
INFO:LiteLLM:
LiteLLM completion() model= deepseek-chat; provider = deepseek
[previous 4-line LiteLLM block repeated 46 times at ~5-minute intervals, 21:49:35 through 01:35:54]
INFO:     Shutting down
INFO:     Waiting for application shutdown.
INFO:app.app:[M6] Stopping gRPC server...
INFO:     Application shutdown complete.
INFO:     Finished server process [14535]