AI Gateway integration
LiteLLM integration
LiteLLM is an open-source AI gateway. For more information on integrating Aporia with AI gateways, see this guide.
Integration Guide
Installation
To configure LiteLLM with Aporia, start by installing LiteLLM:
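A minimal install sketch (the `[proxy]` extra pulls in the gateway/proxy dependencies):

```shell
pip install 'litellm[proxy]'
```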
For more details, see the LiteLLM Getting Started guide.
Use LiteLLM AI Gateway with Aporia Guardrails
In this tutorial, we'll use the LiteLLM Proxy with Aporia to detect PII in requests.
1. Set up guardrails on Aporia
Pre-Call: Detect PII
Add the PII - Prompt policy to your Aporia project.
2. Define Guardrails on your LiteLLM config.yaml
- Define your guardrails under the `guardrails` section and set `mode: pre_call` for each guardrail.
Supported values for `mode`:
- `pre_call`: run before the LLM call, on input
- `post_call`: run after the LLM call, on input & output
- `during_call`: run during the LLM call, on input
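A minimal `config.yaml` sketch. The model name, guardrail name, and environment variable names are placeholders; adjust them to your setup:

```yaml
model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: openai/gpt-3.5-turbo
      api_key: os.environ/OPENAI_API_KEY

guardrails:
  - guardrail_name: "aporia-pre-guard"
    litellm_params:
      guardrail: aporia          # use the Aporia guardrail integration
      mode: "pre_call"           # run before the LLM call, on input
      api_key: os.environ/APORIA_API_KEY
      api_base: os.environ/APORIA_API_BASE
```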
3. Start LiteLLM Gateway
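Assuming the config above is saved as `config.yaml`, the gateway can be started with:

```shell
litellm --config config.yaml --detailed_debug
```

`--detailed_debug` is optional; it prints request/guardrail traces, which is useful while testing.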
4. Test request
Expect this request to fail, since ishaan@berri.ai in the request is PII.
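An example request against the local gateway (the bearer token is a placeholder for your LiteLLM key, and the guardrail name assumes the `aporia-pre-guard` entry from the config above):

```shell
curl -i http://localhost:4000/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer sk-1234' \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "hi my email is ishaan@berri.ai"}],
    "guardrails": ["aporia-pre-guard"]
  }'
```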
Expected response on failure
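An illustrative shape of the failure response (field names and values are indicative, not verbatim; the Aporia payload varies by policy):

```json
{
  "error": {
    "message": {
      "error": "Violated guardrail policy",
      "aporia_ai_response": {}
    },
    "type": "None",
    "param": "None",
    "code": "400"
  }
}
```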
5. Control Guardrails per Project (API Key)
Use this to control which guardrails run per project. In this tutorial, we only want the following guardrails to run for one project (API key): `guardrails: ["aporia-pre-guard", "aporia"]`
Step 1: Create a key with guardrail settings
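A sketch using the gateway's key-generation endpoint (the bearer token is a placeholder for your master key):

```shell
curl -X POST 'http://localhost:4000/key/generate' \
  -H 'Authorization: Bearer sk-1234' \
  -H 'Content-Type: application/json' \
  -d '{
    "guardrails": ["aporia-pre-guard", "aporia"]
  }'
```

The response includes a new virtual key; requests made with that key will run only the listed guardrails.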
Step 2: Test it with the new key
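For example, repeating the PII test with the key from Step 1 (shown as a placeholder):

```shell
curl -i http://localhost:4000/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -H 'Authorization: Bearer <key-from-step-1>' \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "hi my email is ishaan@berri.ai"}]
  }'
```

Since the key itself carries the guardrail settings, the request body no longer needs a `guardrails` field, and the request should still be blocked for PII.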