
Secret Manager

LiteLLM supports reading secrets from AWS Key Management Service, AWS Secret Manager, Azure Key Vault, Google Key Management Service, and Infisical.

AWS Key Management Service

Use AWS KMS to store an encrypted copy of your Proxy Master Key in the environment.

export LITELLM_MASTER_KEY="djZ9xjVaZ..." # 👈 ENCRYPTED KEY
export AWS_REGION_NAME="us-west-2"
general_settings:
  key_management_system: "aws_kms"
  key_management_settings:
    hosted_keys: ["LITELLM_MASTER_KEY"] # 👈 WHICH KEYS ARE STORED ON KMS

See Decryption Code
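
For context, the encrypted value above might be produced along these lines with boto3 (a hedged sketch; the KMS key ARN, region, and plaintext master key are placeholders, and base64-encoding the ciphertext for the environment variable is an assumption):

# Hedged sketch: encrypt a master key with AWS KMS and base64-encode it
# so it can be stored in the LITELLM_MASTER_KEY environment variable.
# The key ARN, region, and plaintext value below are placeholders.
import base64

import boto3

kms = boto3.client("kms", region_name="us-west-2")

response = kms.encrypt(
    KeyId="arn:aws:kms:us-west-2:123456789012:key/your-key-id",  # placeholder key ARN
    Plaintext=b"sk-1234",  # the plaintext master key you want to protect
)

encrypted_master_key = base64.b64encode(response["CiphertextBlob"]).decode("utf-8")
print(encrypted_master_key)  # export this value as LITELLM_MASTER_KEY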

AWS Secret Manager

Store your proxy keys in AWS Secret Manager.

Proxy Usage

  1. Save AWS Credentials in your environment
os.environ["AWS_ACCESS_KEY_ID"] = ""  # Access key
os.environ["AWS_SECRET_ACCESS_KEY"] = "" # Secret access key
os.environ["AWS_REGION_NAME"] = "" # us-east-1, us-east-2, us-west-1, us-west-2
  2. Enable AWS Secret Manager in config. (A sketch for creating the secret itself follows these steps.)
general_settings:
  master_key: os.environ/litellm_master_key
  key_management_system: "aws_secret_manager" # 👈 KEY CHANGE
  key_management_settings:
    hosted_keys: ["litellm_master_key"] # 👈 Specify which env keys you stored on AWS
  3. Run proxy
litellm --config /path/to/config.yaml
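
The keys listed under hosted_keys need to exist in AWS Secret Manager before the proxy starts. A hedged sketch of creating one with boto3 (the region, secret name, and value are placeholders):

# Hedged sketch: store the proxy master key in AWS Secret Manager with boto3.
# The secret name must match the entry listed under hosted_keys above;
# the region and value below are placeholders.
import boto3

secrets = boto3.client("secretsmanager", region_name="us-east-1")

secrets.create_secret(
    Name="litellm_master_key",   # matches hosted_keys in config.yaml
    SecretString="sk-1234",      # placeholder master key value
)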

Azure Key Vault

Quick Start

### Instantiate Azure Key Vault Client ###
import os

from azure.keyvault.secrets import SecretClient
from azure.identity import ClientSecretCredential

# Set your Azure Key Vault URI
KVUri = os.getenv("AZURE_KEY_VAULT_URI")

# Set your Azure AD application/client ID, client secret, and tenant ID - create an application with permission to call your key vault
client_id = os.getenv("AZURE_CLIENT_ID")
client_secret = os.getenv("AZURE_CLIENT_SECRET")
tenant_id = os.getenv("AZURE_TENANT_ID")

# Initialize the ClientSecretCredential
credential = ClientSecretCredential(client_id=client_id, client_secret=client_secret, tenant_id=tenant_id)

# Create the SecretClient using the credential
client = SecretClient(vault_url=KVUri, credential=credential)

### Connect to LiteLLM ###
import litellm
litellm.secret_manager = client

litellm.get_secret("your-test-key")
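
If the vault does not yet contain a secret named your-test-key, one can be created with the same client first (a hedged example; the secret name and value are placeholders):

# Hedged example: create the secret that get_secret() above looks up.
# "your-test-key" and its value are placeholders.
client.set_secret("your-test-key", "hello-from-key-vault")

print(litellm.get_secret("your-test-key"))  # should print the value just stored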

Usage with OpenAI Proxy Server

  1. Install Proxy dependencies
pip install 'litellm[proxy]' 'litellm[extra_proxy]'
  2. Save Azure details in your environment
export AZURE_CLIENT_ID="your-azure-app-client-id"
export AZURE_CLIENT_SECRET="your-azure-app-client-secret"
export AZURE_TENANT_ID="your-azure-tenant-id"
export AZURE_KEY_VAULT_URI="your-azure-key-vault-uri"
  3. Add to proxy config.yaml (a sketch for creating the referenced vault secrets follows the config)
model_list:
  - model_name: "my-azure-models" # model alias
    litellm_params:
      model: "azure/<your-deployment-name>"
      api_key: "os.environ/AZURE-API-KEY" # reads from key vault - get_secret("AZURE_API_KEY")
      api_base: "os.environ/AZURE-API-BASE" # reads from key vault - get_secret("AZURE_API_BASE")

general_settings:
  key_management_system: "azure_key_vault"
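
The os.environ/ values above are resolved against secrets in the vault, so AZURE-API-KEY and AZURE-API-BASE need to exist there. A hedged sketch of creating them, reusing the SecretClient from the quick start above (the values are placeholders):

# Hedged sketch: create the vault secrets referenced in config.yaml.
# "client" is the SecretClient from the quick start; values are placeholders.
client.set_secret("AZURE-API-KEY", "your-azure-openai-api-key")
client.set_secret("AZURE-API-BASE", "https://your-resource.openai.azure.com/")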

You can now test this by starting your proxy:

litellm --config /path/to/config.yaml

Quick Test Proxy

Google Key Management Service

Use encrypted keys from Google KMS on the proxy

Usage with OpenAI Proxy Server

Step 1. Add keys to env

export GOOGLE_APPLICATION_CREDENTIALS="/path/to/credentials.json"
export GOOGLE_KMS_RESOURCE_NAME="projects/*/locations/*/keyRings/*/cryptoKeys/*"
export PROXY_DATABASE_URL_ENCRYPTED=b'\n$\x00D\xac\xb4/\x8e\xc...'
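
For context, the encrypted value might be produced with the google-cloud-kms client along these lines (a hedged sketch; the project, key ring, key name, and database URL are placeholders):

# Hedged sketch: encrypt the database URL with Google Cloud KMS so it can be
# exported as PROXY_DATABASE_URL_ENCRYPTED. All names below are placeholders.
from google.cloud import kms

kms_client = kms.KeyManagementServiceClient()

key_name = "projects/my-project/locations/global/keyRings/my-ring/cryptoKeys/my-key"
plaintext = b"postgresql://user:password@host:5432/litellm"

response = kms_client.encrypt(request={"name": key_name, "plaintext": plaintext})
print(response.ciphertext)  # export this value as PROXY_DATABASE_URL_ENCRYPTED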

Step 2: Update Config

general_settings:
  key_management_system: "google_kms"
  database_url: "os.environ/PROXY_DATABASE_URL_ENCRYPTED"
  master_key: sk-1234

Step 3: Start + test proxy

$ litellm --config /path/to/config.yaml

And in another terminal

$ litellm --test 

Quick Test Proxy
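
Alternatively, the running proxy can be exercised with the OpenAI Python SDK (a hedged sketch; the base_url assumes the proxy's default address, and the model name is a placeholder for whatever is defined in your config.yaml):

# Hedged sketch: call the running LiteLLM proxy with the OpenAI Python SDK.
# The base_url assumes the default proxy address; the model name is a placeholder.
from openai import OpenAI

client = OpenAI(
    api_key="sk-1234",               # proxy master key from config.yaml
    base_url="http://0.0.0.0:4000",  # default LiteLLM proxy address (assumption)
)

response = client.chat.completions.create(
    model="gpt-3.5-turbo",           # placeholder model name
    messages=[{"role": "user", "content": "Hello from the proxy test"}],
)
print(response.choices[0].message.content)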

Infisical Secret Manager

Integrates with Infisical's Secret Manager for secure storage and retrieval of API keys and sensitive data.

Usage

LiteLLM reads your LLM API secrets/environment variables from Infisical for you.

import litellm
from infisical import InfisicalClient

litellm.secret_manager = InfisicalClient(token="your-token")

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What's the weather like today?"},
]

response = litellm.completion(model="gpt-3.5-turbo", messages=messages)

print(response)

.env Files

If no secret manager client is specified, LiteLLM automatically uses the .env file to manage sensitive data.
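
For example, with no secret_manager configured, a key present in the environment (or loaded from a local .env file) is picked up directly. A minimal sketch, assuming python-dotenv is installed and .env contains an OPENAI_API_KEY entry:

# Minimal sketch of the default behavior: no secret manager configured,
# so keys are read from the environment / a local .env file.
# python-dotenv and the OPENAI_API_KEY entry in .env are assumptions here.
import litellm
from dotenv import load_dotenv

load_dotenv()  # loads OPENAI_API_KEY=sk-... from a local .env file

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response)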