LiteLLM is an open-source proxy that gives your team a single endpoint for multiple LLM providers. You can point it at Consus Gateway so all requests route through government-authorized infrastructure.
For LiteLLM setup and deployment instructions, see the LiteLLM docs.
Authentication
Consus Gateway authenticates via the `x-api-key` header. LiteLLM's built-in API key field is required by LiteLLM but ignored by Consus Gateway, so your real Consus API key must be passed via `extra_headers`.
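Before wiring up LiteLLM, you can confirm your key works by hitting the gateway directly. A minimal stdlib sketch; the `/chat/completions` path and request shape are assumed from the gateway's OpenAI compatibility:

```python
import json
import os
import urllib.request

def build_consus_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completions request for Consus Gateway.

    Note that authentication uses the x-api-key header, not the usual
    Authorization bearer token.
    """
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        "https://api.consus.io/v1/chat/completions",
        data=body,
        headers={
            "x-api-key": os.environ.get("CONSUS_API_KEY", ""),
            "Content-Type": "application/json",
        },
        method="POST",
    )

# To send the request:
# with urllib.request.urlopen(build_consus_request("gpt-4.1:il5+itar", "Hello")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```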
Add models to your `config.yaml`. Use the `openai/` prefix in the `model` field and pass your key via `extra_headers`:
Full config.yaml with all available models
```yaml
model_list:
  - model_name: claude-3-7-sonnet:il5+itar
    litellm_params:
      model: openai/claude-3-7-sonnet:il5+itar
      api_base: https://api.consus.io/v1
      api_key: dummy
      extra_headers:
        x-api-key: os.environ/CONSUS_API_KEY
  - model_name: claude-sonnet-4-5:il5+itar
    litellm_params:
      model: openai/claude-sonnet-4-5:il5+itar
      api_base: https://api.consus.io/v1
      api_key: dummy
      extra_headers:
        x-api-key: os.environ/CONSUS_API_KEY
  - model_name: claude-sonnet-4-5:fedramp-high
    litellm_params:
      model: openai/claude-sonnet-4-5:fedramp-high
      api_base: https://api.consus.io/v1
      api_key: dummy
      extra_headers:
        x-api-key: os.environ/CONSUS_API_KEY
  - model_name: claude-opus-4-6:fedramp-high
    litellm_params:
      model: openai/claude-opus-4-6:fedramp-high
      api_base: https://api.consus.io/v1
      api_key: dummy
      extra_headers:
        x-api-key: os.environ/CONSUS_API_KEY
  - model_name: claude-sonnet-4-6:fedramp-high
    litellm_params:
      model: openai/claude-sonnet-4-6:fedramp-high
      api_base: https://api.consus.io/v1
      api_key: dummy
      extra_headers:
        x-api-key: os.environ/CONSUS_API_KEY
  - model_name: claude-opus-4-5:fedramp-high
    litellm_params:
      model: openai/claude-opus-4-5:fedramp-high
      api_base: https://api.consus.io/v1
      api_key: dummy
      extra_headers:
        x-api-key: os.environ/CONSUS_API_KEY
  - model_name: claude-haiku-4-5:fedramp-high
    litellm_params:
      model: openai/claude-haiku-4-5:fedramp-high
      api_base: https://api.consus.io/v1
      api_key: dummy
      extra_headers:
        x-api-key: os.environ/CONSUS_API_KEY
  - model_name: claude-opus-4-1:fedramp-high
    litellm_params:
      model: openai/claude-opus-4-1:fedramp-high
      api_base: https://api.consus.io/v1
      api_key: dummy
      extra_headers:
        x-api-key: os.environ/CONSUS_API_KEY
  - model_name: claude-opus-4:fedramp-high
    litellm_params:
      model: openai/claude-opus-4:fedramp-high
      api_base: https://api.consus.io/v1
      api_key: dummy
      extra_headers:
        x-api-key: os.environ/CONSUS_API_KEY
  - model_name: claude-sonnet-4:fedramp-high
    litellm_params:
      model: openai/claude-sonnet-4:fedramp-high
      api_base: https://api.consus.io/v1
      api_key: dummy
      extra_headers:
        x-api-key: os.environ/CONSUS_API_KEY
  - model_name: gemini-2-5-pro:il5
    litellm_params:
      model: openai/gemini-2-5-pro:il5
      api_base: https://api.consus.io/v1
      api_key: dummy
      extra_headers:
        x-api-key: os.environ/CONSUS_API_KEY
  - model_name: gemini-2-5-flash:il5
    litellm_params:
      model: openai/gemini-2-5-flash:il5
      api_base: https://api.consus.io/v1
      api_key: dummy
      extra_headers:
        x-api-key: os.environ/CONSUS_API_KEY
  - model_name: gpt-4.1:il5+itar
    litellm_params:
      model: openai/gpt-4.1:il5+itar
      api_base: https://api.consus.io/v1
      api_key: dummy
      extra_headers:
        x-api-key: os.environ/CONSUS_API_KEY
  - model_name: gpt-4.1-mini:il5+itar
    litellm_params:
      model: openai/gpt-4.1-mini:il5+itar
      api_base: https://api.consus.io/v1
      api_key: dummy
      extra_headers:
        x-api-key: os.environ/CONSUS_API_KEY
```
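Every entry in the config above differs only in the model name, so the `model_list` can be generated rather than hand-maintained. A sketch (not part of the official tooling) that reproduces the same structure; dump the result with `yaml.safe_dump(CONFIG, sort_keys=False)` if PyYAML is installed:

```python
# All available Consus model names with their compliance levels.
MODELS = [
    "claude-3-7-sonnet:il5+itar",
    "claude-sonnet-4-5:il5+itar",
    "claude-sonnet-4-5:fedramp-high",
    "claude-opus-4-6:fedramp-high",
    "claude-sonnet-4-6:fedramp-high",
    "claude-opus-4-5:fedramp-high",
    "claude-haiku-4-5:fedramp-high",
    "claude-opus-4-1:fedramp-high",
    "claude-opus-4:fedramp-high",
    "claude-sonnet-4:fedramp-high",
    "gemini-2-5-pro:il5",
    "gemini-2-5-flash:il5",
    "gpt-4.1:il5+itar",
    "gpt-4.1-mini:il5+itar",
]

def make_entry(model: str) -> dict:
    """Build one model_list entry with the shared Consus boilerplate."""
    return {
        "model_name": model,
        "litellm_params": {
            "model": f"openai/{model}",  # openai/ prefix is required
            "api_base": "https://api.consus.io/v1",
            "api_key": "dummy",  # required by LiteLLM, ignored by Consus
            "extra_headers": {"x-api-key": "os.environ/CONSUS_API_KEY"},
        },
    }

CONFIG = {"model_list": [make_entry(m) for m in MODELS]}
```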
Alternatively, you can add a model through the LiteLLM UI:

1. Go to Models + Endpoints and click Add Model.
2. Set Provider to OpenAI-Compatible Endpoints (Together AI, etc.).
3. Set LiteLLM Model Name to `openai/<model>:<compliance-level>` (e.g. `openai/gpt-4.1:il5+itar`).
4. Set API Base to `https://api.consus.io/v1`.
5. Set OpenAI API Key to any value (required by LiteLLM, ignored by Consus).
6. Under Advanced Settings, set LiteLLM Params to:

   ```json
   { "extra_headers": { "x-api-key": "YOUR_CONSUS_API_KEY" } }
   ```

7. Click Add Model.
Do not use the Custom OpenAI provider. It prepends `openai/` to the model name sent to Consus Gateway, causing a 400 error. Use OpenAI-Compatible Endpoints instead.
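The difference comes down to what ends up in the `model` field of the request that reaches Consus. A simplified illustration of the behavior described above (not LiteLLM's actual internals):

```python
def model_sent_to_consus(provider: str, litellm_model: str) -> str:
    """Simplified view of the model string each provider type forwards.

    With OpenAI-Compatible Endpoints, LiteLLM's openai/ routing prefix is
    stripped before forwarding, so Consus receives a name it recognizes.
    With the Custom OpenAI provider, the forwarded name carries the
    openai/ prefix, which Consus rejects with a 400.
    """
    bare = litellm_model.removeprefix("openai/")
    if provider == "openai-compatible":
        return bare                 # e.g. "gpt-4.1:il5+itar" -> accepted
    if provider == "custom-openai":
        return f"openai/{bare}"     # prefix left in -> 400 from Consus
    raise ValueError(f"unknown provider type: {provider}")
```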
For the full list of available models and compliance levels, see the Model Explorer.