Exceptions¶
llm_client.exceptions¶
Custom exception classes for llm_client package.
Classes¶
APIKeyNotFoundError¶
Bases: LLMClientError
Raised when a required API key is missing.
This exception is raised when attempting to initialize a provider that requires an API key, but the key is not found in environment variables or passed explicitly.
Attributes:

| Name | Type | Description |
|---|---|---|
| provider | | Name of the provider that requires the key. |
| key_name | | Name of the environment variable for the API key. |
Examples:
>>> raise APIKeyNotFoundError("openai", "OPENAI_API_KEY")
APIKeyNotFoundError: OPENAI_API_KEY not found for openai provider.
Please set it in environment or pass explicitly.
Source code in llm_client/exceptions.py
Functions¶
__init__(provider, key_name)¶
Initialize the exception.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| provider | str | Name of the provider (e.g., 'openai', 'groq'). | required |
| key_name | str | Name of the required environment variable. | required |
Source code in llm_client/exceptions.py
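A minimal handling sketch. The class body below is reconstructed from the documented attributes and example message, and `require_key` is a hypothetical helper, not part of llm_client; in real code, import the exception from `llm_client.exceptions`:

```python
import os

# Stand-in reconstructed from the documented message format; import the real
# classes from llm_client.exceptions in actual code.
class LLMClientError(Exception):
    """Base exception for LLM Client."""

class APIKeyNotFoundError(LLMClientError):
    def __init__(self, provider, key_name):
        self.provider = provider
        self.key_name = key_name
        super().__init__(
            f"{key_name} not found for {provider} provider. "
            "Please set it in environment or pass explicitly."
        )

# Hypothetical helper: look up a key, raising the documented exception if absent.
def require_key(provider: str, key_name: str) -> str:
    key = os.environ.get(key_name)
    if not key:
        raise APIKeyNotFoundError(provider, key_name)
    return key

try:
    require_key("example", "EXAMPLE_PROVIDER_API_KEY")
except APIKeyNotFoundError as e:
    print(f"Set {e.key_name} before using the {e.provider} provider")
```

Catching by the specific class lets callers report exactly which variable to set, via the `provider` and `key_name` attributes.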
ChatCompletionError¶
Bases: LLMClientError
Raised when a chat completion request fails.
This exception is raised when an API call to the LLM provider fails after all retry attempts have been exhausted.
Attributes:

| Name | Type | Description |
|---|---|---|
| provider | | Name of the provider that failed. |
| original_error | | The original exception that caused the failure. |
Examples:
>>> raise ChatCompletionError("openai", ConnectionError("Network error"))
ChatCompletionError: Chat completion failed for openai provider:
Network error
Source code in llm_client/exceptions.py
Functions¶
__init__(provider, original_error)¶
Initialize the exception.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| provider | str | Name of the provider where the error occurred. | required |
| original_error | Exception | The original exception that was raised. | required |
Source code in llm_client/exceptions.py
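A sketch of the wrapping pattern this exception supports. The class body is reconstructed from the documented message format, and `complete` is a hypothetical call site that simulates a provider failure:

```python
# Stand-ins reconstructed from the documented message format; import the real
# classes from llm_client.exceptions in actual code.
class LLMClientError(Exception):
    """Base exception for LLM Client."""

class ChatCompletionError(LLMClientError):
    def __init__(self, provider, original_error):
        self.provider = provider
        self.original_error = original_error
        super().__init__(
            f"Chat completion failed for {provider} provider: {original_error}"
        )

# Hypothetical call site: wrap the provider failure once retries are exhausted.
def complete(provider: str) -> str:
    try:
        raise ConnectionError("Network error")  # simulated provider failure
    except Exception as exc:
        raise ChatCompletionError(provider, exc) from exc

try:
    complete("openai")
except ChatCompletionError as e:
    # original_error preserves the root cause for logging or selective retry
    print(type(e.original_error).__name__, e.original_error)
```

Keeping the root cause on `original_error` (and chaining with `raise ... from`) lets callers distinguish, say, network errors from authentication errors without parsing the message string.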
FileUploadNotSupportedError¶
Bases: LLMClientError
Raised when file upload is requested but not supported.
This exception is raised when attempting to upload files to a provider that doesn't support file/multimodal inputs.
Attributes:

| Name | Type | Description |
|---|---|---|
| provider | | Name of the provider. |
| file_type | | File type of the attempted upload. |
Examples:
>>> raise FileUploadNotSupportedError("groq", "pdf")
FileUploadNotSupportedError: File upload not supported for groq provider.
Supported file type: pdf
Source code in llm_client/exceptions.py
Functions¶
__init__(provider, file_type=None)¶
Initialize the exception.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| provider | str | Name of the provider. | required |
| file_type | str \| None | Optional file type that was attempted. | None |
Source code in llm_client/exceptions.py
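A guard-clause sketch. The class body mirrors the documented example output, while `MULTIMODAL_PROVIDERS` and `upload_file` are hypothetical illustrations, not part of the package:

```python
class LLMClientError(Exception):
    """Base exception for LLM Client."""

class FileUploadNotSupportedError(LLMClientError):
    def __init__(self, provider, file_type=None):
        self.provider = provider
        self.file_type = file_type
        msg = f"File upload not supported for {provider} provider."
        if file_type:
            # Mirrors the documented example output.
            msg += f" Supported file type: {file_type}"
        super().__init__(msg)

# Hypothetical capability table, for illustration only.
MULTIMODAL_PROVIDERS = {"openai"}

def upload_file(provider: str, file_type: str) -> str:
    if provider not in MULTIMODAL_PROVIDERS:
        raise FileUploadNotSupportedError(provider, file_type)
    return f"uploaded {file_type}"

try:
    upload_file("groq", "pdf")
except FileUploadNotSupportedError as e:
    print(f"{e.provider} cannot accept {e.file_type} input")
```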
InvalidProviderError¶
Bases: LLMClientError
Raised when an invalid provider name is specified.
This exception is raised when attempting to create or switch to a provider that doesn't exist.
Attributes:

| Name | Type | Description |
|---|---|---|
| provider | | The invalid provider name that was specified. |
| valid_providers | | List of valid provider names. |
Examples:
>>> raise InvalidProviderError("invalid", ["openai", "groq"])
InvalidProviderError: Invalid provider: invalid.
Valid providers are: openai, groq
Source code in llm_client/exceptions.py
Functions¶
__init__(provider, valid_providers)¶
Initialize the exception.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| provider | str | The invalid provider name. | required |
| valid_providers | list[str] | List of valid provider names. | required |
Source code in llm_client/exceptions.py
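A validation sketch. The class body is reconstructed from the documented message format; `VALID_PROVIDERS` and `resolve_provider` are hypothetical, for illustration only:

```python
class LLMClientError(Exception):
    """Base exception for LLM Client."""

class InvalidProviderError(LLMClientError):
    def __init__(self, provider, valid_providers):
        self.provider = provider
        self.valid_providers = valid_providers
        super().__init__(
            f"Invalid provider: {provider}. "
            f"Valid providers are: {', '.join(valid_providers)}"
        )

# Hypothetical registry and validator, for illustration only.
VALID_PROVIDERS = ["openai", "groq"]

def resolve_provider(name: str) -> str:
    if name not in VALID_PROVIDERS:
        raise InvalidProviderError(name, VALID_PROVIDERS)
    return name

try:
    resolve_provider("invalid")
except InvalidProviderError as e:
    print(f"Choose one of: {', '.join(e.valid_providers)}")
```

Because `valid_providers` travels with the exception, callers can render an accurate hint without hard-coding the provider list twice.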
LLMClientError¶
Bases: Exception
Base exception for LLM Client.
All custom exceptions in the llm_client package inherit from this class. This allows users to catch all llm_client-specific errors with a single except clause.
Examples:
>>> try:
... client = LLMClient(api_choice="openai")
... except LLMClientError as e:
... print(f"LLM Client error: {e}")
Source code in llm_client/exceptions.py
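The hierarchy can be sketched as follows; the subclasses here are simplified stand-ins (the real ones carry the structured attributes documented on this page), and `make_client` is a hypothetical factory used only to trigger an error:

```python
class LLMClientError(Exception):
    """Base exception for LLM Client."""

# Simplified stand-in subclasses; the real classes take structured arguments.
class APIKeyNotFoundError(LLMClientError):
    pass

class InvalidProviderError(LLMClientError):
    pass

# Hypothetical factory, used here only to trigger an error for the demo.
def make_client(api_choice: str) -> None:
    if api_choice not in ("openai", "groq"):
        raise InvalidProviderError(f"Invalid provider: {api_choice}")

caught = []
for choice in ("invalid", "openai"):
    try:
        make_client(choice)
    except LLMClientError as e:  # one clause catches every llm_client error
        caught.append(str(e))

print(caught)  # only the invalid choice produced an error
```

Catching `LLMClientError` shields application code from needing to know every concrete subclass, while still letting unrelated exceptions (e.g. `KeyboardInterrupt`) propagate.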
ProviderNotAvailableError¶
Bases: LLMClientError
Raised when the requested provider is not available.
This exception is raised when attempting to use a provider whose required package is not installed.
Attributes:

| Name | Type | Description |
|---|---|---|
| provider | | Name of the provider that is not available. |
| package_name | | Name of the required package. |
Examples:
>>> raise ProviderNotAvailableError("groq", "groq")
ProviderNotAvailableError: groq provider not available.
Install with: pip install groq
Source code in llm_client/exceptions.py
Functions¶
__init__(provider, package_name)¶
Initialize the exception.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| provider | str | Name of the provider (e.g., 'openai', 'groq'). | required |
| package_name | str | Name of the required pip package. | required |
Source code in llm_client/exceptions.py
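A sketch of the optional-dependency pattern behind this exception. The class body is reconstructed from the documented message format; `load_sdk` is a hypothetical loader, not part of the package:

```python
import importlib

class LLMClientError(Exception):
    """Base exception for LLM Client."""

class ProviderNotAvailableError(LLMClientError):
    def __init__(self, provider, package_name):
        self.provider = provider
        self.package_name = package_name
        super().__init__(
            f"{provider} provider not available. "
            f"Install with: pip install {package_name}"
        )

# Hypothetical loader: translate a missing optional dependency into the
# documented exception with an actionable install hint.
def load_sdk(provider: str, package_name: str):
    try:
        return importlib.import_module(package_name)
    except ImportError:
        raise ProviderNotAvailableError(provider, package_name) from None

try:
    load_sdk("groq", "a_package_that_is_not_installed")
except ProviderNotAvailableError as e:
    print(f"pip install {e.package_name}")
```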
StreamingNotSupportedError¶
Bases: LLMClientError
Raised when streaming is requested but not supported.
This exception is raised when attempting to use streaming with a provider or configuration that doesn't support it.
Attributes:

| Name | Type | Description |
|---|---|---|
| provider | | Name of the provider. |
| reason | | Optional reason why streaming is not supported. |
Examples:
>>> raise StreamingNotSupportedError("custom_provider")
StreamingNotSupportedError: Streaming not supported for custom_provider
Source code in llm_client/exceptions.py
Functions¶
__init__(provider, reason=None)¶
Initialize the exception.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| provider | str | Name of the provider. | required |
| reason | str \| None | Optional explanation of why streaming isn't supported. | None |
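A fallback sketch. The class body reconstructs the documented message (the `": {reason}"` suffix is an assumption, since the documented example omits a reason), and `chat` / `chat_with_fallback` are hypothetical functions, not part of the package:

```python
class LLMClientError(Exception):
    """Base exception for LLM Client."""

class StreamingNotSupportedError(LLMClientError):
    def __init__(self, provider, reason=None):
        self.provider = provider
        self.reason = reason
        msg = f"Streaming not supported for {provider}"
        if reason:
            msg += f": {reason}"  # assumed suffix format, see note above
        super().__init__(msg)

# Hypothetical chat function: some providers reject stream=True.
def chat(provider: str, stream: bool = False) -> str:
    if stream and provider == "custom_provider":
        raise StreamingNotSupportedError(provider, "no SSE endpoint")
    return "full response"

def chat_with_fallback(provider: str) -> str:
    try:
        return chat(provider, stream=True)
    except StreamingNotSupportedError:
        return chat(provider, stream=False)  # degrade to a blocking call

print(chat_with_fallback("custom_provider"))
```

Catching this exception at the call site lets an application degrade gracefully to non-streaming responses instead of failing outright.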