
🤖 LLM Provider Module Documentation

Location: app/core/providers.py

This module provides a standardized and extensible interface for interacting with multiple Large Language Model (LLM) APIs. It enables pluggable support for models such as DeepSeek and Gemini using a unified abstraction layer.


1. ⚙️ Configuration

The module's behavior is controlled by environment variables, primarily for authentication and model selection.

📌 Environment Variables

Variable              Description                              Default Value
DEEPSEEK_API_KEY      Your DeepSeek API key (required)         None
GEMINI_API_KEY        Your Google Gemini API key (required)    None
DEEPSEEK_MODEL_NAME   Name of the DeepSeek model to use        "deepseek-chat"
GEMINI_MODEL_NAME     Name of the Gemini model to use          "gemini-1.5-flash-latest"

💡 Tip: Set these variables in your .env file or environment before running the application.
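
For reference, the module presumably reads these settings roughly as follows (a sketch of the lookup with the defaults above, not the actual code in providers.py):

import os

# Required keys: no default, so a missing variable yields None
deepseek_api_key = os.getenv("DEEPSEEK_API_KEY")
gemini_api_key = os.getenv("GEMINI_API_KEY")

# Optional model names fall back to the documented defaults
deepseek_model = os.getenv("DEEPSEEK_MODEL_NAME", "deepseek-chat")
gemini_model = os.getenv("GEMINI_MODEL_NAME", "gemini-1.5-flash-latest")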


2. 🧱 Core Components

The system uses a provider interface pattern to support multiple LLMs with a consistent API.

🧩 LLMProvider (Abstract Base Class)

  • Base class for all LLM providers.
  • Defines the required method:

    def generate_response(self, prompt: str) -> str
  • All concrete providers must implement this method.
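
A minimal sketch of the base class, assuming a standard abc-based interface (the actual definition in app/core/providers.py may differ in detail):

from abc import ABC, abstractmethod

class LLMProvider(ABC):
    """Common interface every LLM provider implements."""

    @abstractmethod
    def generate_response(self, prompt: str) -> str:
        """Send the prompt to the underlying LLM and return its text reply."""
        ...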


🧠 DeepSeekProvider

  • Implements the LLMProvider interface.
  • Uses:
    • DEEPSEEK_API_KEY
    • DEEPSEEK_MODEL_NAME
  • Handles API calls to DeepSeek's LLM service.
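
As an illustration only, a provider along these lines can be written against DeepSeek's OpenAI-compatible chat completions endpoint. The endpoint, payload, and error handling below are assumptions for the sketch, not a description of the shipped code:

import os
import requests

class DeepSeekProvider(LLMProvider):
    """Illustrative sketch of a DeepSeek-backed provider."""

    def __init__(self):
        self.api_key = os.getenv("DEEPSEEK_API_KEY")
        self.model = os.getenv("DEEPSEEK_MODEL_NAME", "deepseek-chat")

    def generate_response(self, prompt: str) -> str:
        # DeepSeek exposes an OpenAI-compatible chat completions API.
        resp = requests.post(
            "https://api.deepseek.com/chat/completions",
            headers={"Authorization": f"Bearer {self.api_key}"},
            json={
                "model": self.model,
                "messages": [{"role": "user", "content": prompt}],
            },
            timeout=60,
        )
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]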

🌟 GeminiProvider

  • Implements the LLMProvider interface.
  • Uses:
    • GEMINI_API_KEY
    • GEMINI_MODEL_NAME
  • Handles API calls to Google Gemini's LLM service.
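
A comparable sketch using the google-generativeai client library; again, this is an assumption about how the provider might be wired, not the actual implementation:

import os
import google.generativeai as genai

class GeminiProvider(LLMProvider):
    """Illustrative sketch of a Gemini-backed provider."""

    def __init__(self):
        genai.configure(api_key=os.getenv("GEMINI_API_KEY"))
        model_name = os.getenv("GEMINI_MODEL_NAME", "gemini-1.5-flash-latest")
        self.model = genai.GenerativeModel(model_name)

    def generate_response(self, prompt: str) -> str:
        # generate_content returns a response object whose .text holds the reply.
        result = self.model.generate_content(prompt)
        return result.text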

🏭 get_llm_provider(model_name: str) -> LLMProvider

  • Factory function for obtaining a provider instance.

Usage:

provider = get_llm_provider("deepseek")
response = provider.generate_response("Tell me a joke.")

Supported values for model_name:

  • "deepseek"
  • "gemini"

Raises:

  • ValueError if an unsupported model_name is passed.
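
Internally, the factory is most likely a small lookup over the two provider classes; a plausible sketch consistent with the behavior described above:

def get_llm_provider(model_name: str) -> LLMProvider:
    """Return the provider registered under model_name."""
    providers = {
        "deepseek": DeepSeekProvider,
        "gemini": GeminiProvider,
    }
    if model_name not in providers:
        raise ValueError(f"Unsupported model_name: {model_name!r}")
    return providers[model_name]()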

3. 🚀 Usage Example

Use the get_llm_provider() factory function to obtain a provider instance, then call generate_response() on it:

from app.core.providers import get_llm_provider

# Get provider
llm = get_llm_provider("gemini")

# Generate response
response = llm.generate_response("What is the capital of France?")
print(response)

This design lets you swap or add LLMs (e.g., OpenAI, Anthropic) simply by implementing a new provider class and adding it to the factory, as sketched below.
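
For instance, a hypothetical AnthropicProvider (the class name and the "anthropic" key mentioned below are placeholders, not existing code) would only need to implement the single interface method:

class AnthropicProvider(LLMProvider):
    """Hypothetical new provider; only generate_response() is required."""

    def generate_response(self, prompt: str) -> str:
        # Call the Anthropic API (or any other LLM backend) here and return its text reply.
        raise NotImplementedError("wire up the API call here")

With that class defined, teaching get_llm_provider() to recognize "anthropic" is a one-line addition to its lookup table.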