LiteLLM Self-Hosted Security & Encryption FAQ
Data in Transit Encryption
Does the product encrypt data in transit?
Yes, LiteLLM encrypts data in transit using TLS/SSL.
Available in both OSS and Enterprise?
Yes, TLS encryption is available in both Open Source and Enterprise versions.
In transit between the calling client and the product?
Yes, HTTPS/TLS is supported through SSL certificate configuration.
Configuration:
# CLI
litellm --ssl_keyfile_path /path/to/key.pem --ssl_certfile_path /path/to/cert.pem
# Environment Variables
export SSL_KEYFILE_PATH="/path/to/key.pem"
export SSL_CERTFILE_PATH="/path/to/cert.pem"
Documentation Reference: docs/my-website/docs/guides/security_settings.md
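Once the certificate and key are configured, clients reach the proxy over HTTPS like any other OpenAI-compatible endpoint. A minimal sketch, assuming a proxy at https://litellm.example.com, a virtual key sk-1234, and a model named gpt-4o configured on the proxy (all placeholders):
import httpx
from openai import OpenAI

# Point the OpenAI client at the proxy over HTTPS; pass a CA bundle if the
# proxy uses an internal or self-signed certificate chain.
client = OpenAI(
    api_key="sk-1234",  # placeholder LiteLLM virtual key
    base_url="https://litellm.example.com",  # placeholder proxy URL
    http_client=httpx.Client(verify="/path/to/ca_bundle.pem"),
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name configured on the proxy
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)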
In transit between the product and the LLM providers?
Yes, all connections to LLM providers use TLS encryption by default.
Implementation Details:
- Uses Python's ssl.create_default_context()
- Leverages HTTPX and aiohttp libraries with SSL/TLS enabled
- Uses certifi CA bundle by default for SSL verification
Code Reference: litellm/llms/custom_httpx/http_handler.py (lines 43-105)
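For orientation, a hedged sketch of that pattern (a simplified illustration, not the actual code in litellm/llms/custom_httpx/http_handler.py):
import ssl

import certifi
import httpx

# Default client-side TLS context, trusting the certifi CA bundle.
ssl_context = ssl.create_default_context(cafile=certifi.where())

# httpx accepts an ssl.SSLContext for `verify`; requests sent through this
# client to LLM providers are TLS-encrypted and certificate-verified.
provider_client = httpx.AsyncClient(verify=ssl_context)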
Are TCP sessions to the LLM providers shared?
Yes, TCP connections are pooled and reused.
Details:
- Connection pooling is enabled by default
- Default: 1000 max concurrent connections with keepalive
- Sessions are maintained across requests to the same provider
- Reduces overhead of TLS handshakes
Code Reference: litellm/llms/custom_httpx/http_handler.py (lines 704-712)
Or does the product negotiate a new TLS session with the same LLM provider for every sequential call?
No, TLS sessions are reused through connection pooling. New TLS handshakes are not performed for every request.
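A hedged sketch of what pooling with keepalive looks like in httpx (the 1000-connection figure comes from the defaults above; the keepalive count here is only an example):
import httpx

# Pooled connections with keepalive: sequential requests to the same provider
# reuse an existing connection instead of performing a new TLS handshake.
limits = httpx.Limits(max_connections=1000, max_keepalive_connections=100)
client = httpx.Client(limits=limits)

client.get("https://api.openai.com/v1/models")  # TLS handshake on first request
client.get("https://api.openai.com/v1/models")  # reuses the pooled connection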
How is it encrypted?
TLS 1.2 and TLS 1.3
Uses Python's default SSL context, which supports both TLS 1.2 and TLS 1.3. The specific version negotiated depends on:
- Python version
- System SSL library (typically OpenSSL)
- Server capabilities
Implementation: ssl.create_default_context() in Python
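To confirm what the default context allows on a given deployment host, you can inspect it directly:
import ssl

ctx = ssl.create_default_context()
# On recent Python/OpenSSL builds this typically prints TLSVersion.TLSv1_2
# and TLSVersion.TLSv1_3; older builds may differ.
print(ctx.minimum_version, ctx.maximum_version)
print(ssl.OPENSSL_VERSION)  # the system SSL library in use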
How are these added to the product's configuration?
X.509 Certificate
Method 1: CLI Arguments
litellm --ssl_certfile_path /path/to/certificate.pem
Method 2: Environment Variable
export SSL_CERTFILE_PATH="/path/to/certificate.pem"
Private Key
Method 1: CLI Arguments
litellm --ssl_keyfile_path /path/to/private_key.pem
Method 2: Environment Variable
export SSL_KEYFILE_PATH="/path/to/private_key.pem"
Certificate Bundle/Chain
For client-to-proxy connections: Use standard SSL certificate setup with intermediate certificates bundled in the certfile.
For proxy-to-LLM provider connections:
Method 1: Config YAML
litellm_settings:
  ssl_verify: "/path/to/ca_bundle.pem"
Method 2: Environment Variable
export SSL_CERT_FILE="/path/to/ca_bundle.pem"
Method 3: Client Certificate Authentication
litellm_settings:
  ssl_certificate: "/path/to/client_certificate.pem"
or
export SSL_CERTIFICATE="/path/to/client_certificate.pem"
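When calling LiteLLM as a Python SDK rather than through the proxy, the same settings can be set at module level; a sketch, assuming the SDK exposes the same ssl_verify and ssl_certificate settings as the litellm_settings keys above (paths are placeholders):
import litellm

# Custom CA bundle used to verify LLM provider certificates.
litellm.ssl_verify = "/path/to/ca_bundle.pem"
# Client certificate, if a provider or gateway requires mutual TLS.
litellm.ssl_certificate = "/path/to/client_certificate.pem"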
Documentation Coverage
Primary Documentation:
- docs/my-website/docs/guides/security_settings.md - SSL/TLS configuration guide
Additional References:
- litellm/proxy/proxy_cli.py (lines 455-467) - CLI options
- docs/my-website/docs/completion/http_handler_config.md - Custom HTTP handler configuration
Data at Rest Encryption
Does the product encrypt data at rest?
Partially. Only specific sensitive data is encrypted at rest.
What data is stored in encrypted form?
Encrypted Data:
- LLM API Keys - Model credentials in LiteLLM_ProxyModelTable.litellm_params
- Provider Credentials - Stored in LiteLLM_CredentialsTable.credential_values
- Configuration Secrets - Sensitive config values in the LiteLLM_Config table
- Virtual Keys - When using secret managers (optional feature)
NOT Encrypted:
- Spend Logs - Request/response data in LiteLLM_SpendLogs
- Audit Logs - Change history in LiteLLM_AuditLog
- User/Team/Organization Data - Metadata and configuration
- Cached Prompts and Completions - Cache data is stored in plaintext
Cached prompts and completions?
No, cached prompts and completions are NOT encrypted.
Cache backends (Redis, S3, local disk) store data as plaintext JSON.
Code References:
- litellm/caching/redis_cache.py
- litellm/caching/s3_cache.py
- litellm/caching/caching.py
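One way to spot-check this on a Redis cache backend (the connection details are placeholders, and LiteLLM's exact cache key format may differ):
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

# Cached entries are readable JSON strings, not ciphertext.
for key in r.scan_iter(count=50):
    if r.type(key) == "string":
        print(key, (r.get(key) or "")[:200])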
Configuration data?
Partially encrypted.
What IS Encrypted:
- LLM API keys and credentials in model configurations
- Sensitive values in the LiteLLM_Config table
- Credential values in LiteLLM_CredentialsTable
What is NOT Encrypted:
- Model names and aliases
- Rate limits and budget settings
- User/team/organization metadata
- Non-sensitive configuration parameters
Code Reference: litellm/proxy/management_endpoints/model_management_endpoints.py (lines 275-308)
Log data?
No, log data is NOT encrypted.
Log data stored in database tables is in plaintext:
- LiteLLM_SpendLogs - Contains request/response data, tokens, spend
- LiteLLM_ErrorLogs - Error information
- LiteLLM_AuditLog - Audit trail of changes
Note: You can disable logging to avoid storing sensitive data:
general_settings:
  disable_spend_logs: True # Disable writing spend logs to DB
  disable_error_logs: True # Disable writing error logs to DB
Documentation: docs/my-website/docs/proxy/db_info.md (lines 52-60)
Where is it stored?
In the DB?
Yes, encrypted data is stored in a PostgreSQL database.
Key Tables with Encrypted Data:
- LiteLLM_ProxyModelTable - Model configurations with encrypted API keys
- LiteLLM_CredentialsTable - Credential values
- LiteLLM_Config - Configuration secrets
Schema Reference: schema.prisma
In the filesystem?
No, encrypted data is not stored in the filesystem by default.
Note: If using disk cache (disk_cache_dir), cached data is stored unencrypted.
Somewhere else?
Optional: When using secret managers (AWS Secrets Manager, Azure Key Vault, HashiCorp Vault), encrypted data can be stored externally.
Configuration:
general_settings:
  key_management_system: "aws_secret_manager" # or "azure_key_vault", "hashicorp_vault"
Documentation: docs/my-website/docs/secret.md
How is it encrypted?
Algorithm: NaCl SecretBox (XSalsa20-Poly1305 AEAD)
NOT AES-256 - LiteLLM uses NaCl (Networking and Cryptography Library), which provides:
- XSalsa20 stream cipher
- Poly1305 MAC for authentication
- Equivalent security to AES-256
Key Derivation:
- Takes LITELLM_SALT_KEY (or LITELLM_MASTER_KEY if salt key not set)
- Hashes with SHA-256 to derive 256-bit encryption key
- Uses NaCl SecretBox for authenticated encryption
Code Reference: litellm/proxy/common_utils/encrypt_decrypt_utils.py (lines 69-112)
Implementation (simplified, with placeholder inputs so it runs standalone):
import hashlib
import os

import nacl.secret

# Derive a 256-bit key from the salt key (LITELLM_SALT_KEY, or LITELLM_MASTER_KEY as fallback)
signing_key = os.environ["LITELLM_SALT_KEY"]
hash_object = hashlib.sha256(signing_key.encode())
hash_bytes = hash_object.digest()  # 32 bytes, the SecretBox key size

# Create SecretBox and encrypt the sensitive value (e.g. a provider API key)
value_bytes = b"sk-example-provider-api-key"
box = nacl.secret.SecretBox(hash_bytes)
encrypted = box.encrypt(value_bytes)  # nonce + XSalsa20-Poly1305 ciphertext
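Because the key is derived deterministically from the salt key, anything encrypted under one salt key cannot be decrypted under another. A standalone sketch of that behaviour (illustrative only, not LiteLLM's actual helper functions):
import hashlib

import nacl.exceptions
import nacl.secret

def derive_box(salt_key: str) -> nacl.secret.SecretBox:
    # Same derivation as above: SHA-256 of the salt key -> 32-byte SecretBox key
    return nacl.secret.SecretBox(hashlib.sha256(salt_key.encode()).digest())

ciphertext = derive_box("old-salt-key").encrypt(b"sk-provider-api-key")
assert derive_box("old-salt-key").decrypt(ciphertext) == b"sk-provider-api-key"

try:
    derive_box("new-salt-key").decrypt(ciphertext)
except nacl.exceptions.CryptoError:
    print("Data encrypted under the old salt key is unrecoverable with a new key")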
Setting the Encryption Key
Required Environment Variable:
export LITELLM_SALT_KEY="your-strong-random-key-here"
Important Notes:
- ⚠️ Must be set before adding any models
- ⚠️ Never change this key - encrypted data becomes unrecoverable
- ⚠️ Use a strong random key (recommended: https://1password.com/password-generator/)
- If not set, falls back to LITELLM_MASTER_KEY
Documentation: docs/my-website/docs/proxy/prod.md (section 8, lines 184-196)
Documentation Coverage
Primary Documentation:
- docs/my-website/docs/proxy/prod.md (section 8) - LITELLM_SALT_KEY setup
- docs/my-website/docs/secret.md - Secret management systems
- docs/my-website/docs/proxy/db_info.md - Database information
Additional References:
- security.md - General security measures
- docs/my-website/docs/data_security.md - Data privacy overview
- schema.prisma - Database schema with encrypted fields
Summary of Security Features
✅ Provided Out of the Box
- TLS/SSL encryption for client-to-proxy connections
- TLS encryption for proxy-to-LLM provider connections (with connection pooling)
- Encrypted storage of LLM API keys and credentials
- Support for TLS 1.2 and TLS 1.3
- Connection pooling to reduce TLS handshake overhead
⚠️ Important Limitations
- Cached data is NOT encrypted (Redis, S3, disk cache)
- Log data is NOT encrypted (spend logs, audit logs)
- Request/response payloads in logs are NOT encrypted
- Uses NaCl SecretBox, NOT AES-256 (equivalent security)
- TLS version not explicitly configured - uses Python/system defaults
🔧 Configuration Requirements
For Production Deployments:
- Set LITELLM_SALT_KEY before adding any models
- Configure SSL certificates for HTTPS client connections
- Consider disabling logs if they contain sensitive data
- Use secret managers for enhanced security (optional)
- Configure CA bundles if using custom certificates
Quick Start Security Checklist
# 1. Generate a strong salt key
export LITELLM_SALT_KEY="$(openssl rand -base64 32)"
# 2. Set up SSL certificates (for HTTPS)
export SSL_KEYFILE_PATH="/path/to/private_key.pem"
export SSL_CERTFILE_PATH="/path/to/certificate.pem"
# 3. Configure database
export DATABASE_URL="postgresql://user:password@host:port/dbname"
# 4. (Optional) Disable logs if they contain sensitive data
# Add to config.yaml:
# general_settings:
#   disable_spend_logs: True
#   disable_error_logs: True
# 5. Start LiteLLM Proxy
litellm --config config.yaml
Additional Resources
- LiteLLM Documentation: https://docs.litellm.ai/
- Security Settings Guide: https://docs.litellm.ai/docs/guides/security_settings
- Production Deployment: https://docs.litellm.ai/docs/proxy/prod
- Secret Management: https://docs.litellm.ai/docs/secret
For security inquiries: support@berri.ai