README.md

A separate configuration for each LLM provider is no longer necessary: the setup has been migrated to LiteLLM, which lets the application handle requests for all providers through a single, unified call syntax. This simplifies the codebase and makes it easier to add new models.
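A minimal sketch of what that unified syntax looks like. The helper and model strings below are illustrative (the `gemini/...` prefix follows LiteLLM's provider-routing convention), and the actual `litellm.completion` call is shown commented out because it requires provider API keys:

```python
# Hypothetical helper: builds the single argument shape LiteLLM accepts
# for every provider -- only the model string changes per provider.
def build_completion_kwargs(model: str, prompt: str) -> dict:
    return {
        "model": model,  # e.g. "gpt-4o" or "gemini/gemini-1.5-flash"
        "messages": [{"role": "user", "content": prompt}],
    }

# With API keys configured, the same call serves any provider:
#   from litellm import completion
#   response = completion(**build_completion_kwargs("gemini/gemini-1.5-flash", "Hello"))
#   print(response.choices[0].message.content)

# Demonstrate that the request shape is identical across providers:
for model in ("gpt-4o", "gemini/gemini-1.5-flash"):
    kwargs = build_completion_kwargs(model, "Hello")
    print(kwargs["model"], len(kwargs["messages"]))
```

Because every provider is addressed through the same `completion(model=..., messages=...)` interface, adding a new model is typically just a new model string plus credentials, with no new provider-specific code path.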