Location: `app/core/providers.py`

This module provides a standardized and extensible interface for interacting with multiple Large Language Model (LLM) APIs. It enables pluggable support for models like DeepSeek and Gemini through a unified abstraction layer.
The module's behavior is controlled by environment variables, primarily for authentication and model selection.
| Variable | Description | Default Value |
|---|---|---|
| `DEEPSEEK_API_KEY` | Your DeepSeek API key (required) | None |
| `GEMINI_API_KEY` | Your Google Gemini API key (required) | None |
| `DEEPSEEK_MODEL_NAME` | Name of the DeepSeek model to use | `"deepseek-chat"` |
| `GEMINI_MODEL_NAME` | Name of the Gemini model to use | `"gemini-1.5-flash-latest"` |
💡 Tip: Set these variables in your `.env` file or environment before running the application.
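For reference, here is a minimal sketch of how the module might read this configuration. The exact internals are an assumption; only the variable names and defaults come from the table above:

```python
import os

# Sketch of configuration loading; the defaults mirror the table above.
DEEPSEEK_API_KEY = os.getenv("DEEPSEEK_API_KEY")  # required, no default
GEMINI_API_KEY = os.getenv("GEMINI_API_KEY")      # required, no default
DEEPSEEK_MODEL_NAME = os.getenv("DEEPSEEK_MODEL_NAME", "deepseek-chat")
GEMINI_MODEL_NAME = os.getenv("GEMINI_MODEL_NAME", "gemini-1.5-flash-latest")
```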
The system uses a provider interface pattern to support multiple LLMs with a consistent API.
**LLMProvider (Abstract Base Class)**

Defines the required method:

```python
def generate_response(self, prompt: str) -> str
```
All concrete providers must implement this method.
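As a sketch, the abstract base class likely looks something like the following; the decorators and docstrings are assumptions, since only the method signature is documented above:

```python
from abc import ABC, abstractmethod

class LLMProvider(ABC):
    """Common interface shared by all LLM providers."""

    @abstractmethod
    def generate_response(self, prompt: str) -> str:
        """Send a prompt to the underlying model and return its text reply."""
        ...
```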
**DeepSeekProvider**

Implements the `LLMProvider` interface. Authenticates with `DEEPSEEK_API_KEY` and selects the model via `DEEPSEEK_MODEL_NAME`.

**GeminiProvider**

Implements the `LLMProvider` interface. Authenticates with `GEMINI_API_KEY` and selects the model via `GEMINI_MODEL_NAME`.

**get_llm_provider(model_name: str) -> LLMProvider**

Factory function that returns the provider matching `model_name`:

```python
provider = get_llm_provider("deepseek")
response = provider.generate_response("Tell me a joke.")
```
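A plausible sketch of the factory, assuming the provider classes take no constructor arguments (their exact constructor signatures are not documented here):

```python
def get_llm_provider(model_name: str) -> LLMProvider:
    """Return the provider matching model_name; raise ValueError otherwise."""
    if model_name == "deepseek":
        return DeepSeekProvider()
    if model_name == "gemini":
        return GeminiProvider()
    raise ValueError(f"Unsupported model_name: {model_name!r}")
```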
Accepted values for `model_name` are `"deepseek"` and `"gemini"`; a `ValueError` is raised if an unsupported `model_name` is passed.

Use the `get_llm_provider()` factory function to get a provider instance, then call `generate_response()` on it:
```python
from app.core.providers import get_llm_provider

# Get a provider
llm = get_llm_provider("gemini")

# Generate a response
response = llm.generate_response("What is the capital of France?")
print(response)
```
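Because the factory raises `ValueError` for unknown names, callers that accept user-supplied model names may want to guard the call. A minimal example (the fallback choice here is illustrative):

```python
try:
    llm = get_llm_provider("claude")  # not a supported name
except ValueError as exc:
    print(f"Falling back to default provider: {exc}")
    llm = get_llm_provider("deepseek")
```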
This design lets you swap or extend LLMs (e.g., add OpenAI or Anthropic) by implementing a new provider class, as sketched below.
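For illustration, a hypothetical new provider might look like the following. The class name, endpoint URL, environment variables, and response shape are all assumptions, not part of this module:

```python
import os
import requests

class ExampleProvider(LLMProvider):
    """Hypothetical provider for a generic HTTP chat-completion API."""

    def __init__(self) -> None:
        self.api_key = os.environ["EXAMPLE_API_KEY"]  # assumed variable name
        self.model = os.getenv("EXAMPLE_MODEL_NAME", "example-model")

    def generate_response(self, prompt: str) -> str:
        # Placeholder endpoint and payload shape; adapt to the real API.
        resp = requests.post(
            "https://api.example.com/v1/chat",
            headers={"Authorization": f"Bearer {self.api_key}"},
            json={"model": self.model, "prompt": prompt},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["text"]
```

To wire such a class in, you would also add a branch for its name in `get_llm_provider()`.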