Overview
By integrating Aporia with your AI Gateway, every new LLM-based application gets out-of-the-box guardrails. Teams can then add custom policies for their project.
What is an AI Gateway?
An AI Gateway (or LLM Gateway) is a centralized proxy for LLM-based applications within an organization. This setup enhances governance, management, and control for enterprises.
By routing LLM requests through a centralized gateway rather than directly to LLM providers, you gain multiple benefits:
- Less vendor lock-in: Facilitates easier migrations between different LLM providers.
- Cost control: Manage and monitor expenses on a team-by-team basis.
- Rate limit control: Enforces request limits on a team-by-team basis.
- Retries & Caching: Improves performance and reliability of LLM calls.
- Analytics: Provides insights into usage patterns and operational metrics.
Aporia Guardrails & AI Gateways
Aporia Guardrails is a great fit for AI Gateways: every new LLM app automatically gets default out-of-the-box guardrails for hallucinations, inappropriate responses, prompt injections, data leakage, and more.
If a specific team needs to customize guardrails for their project, they can log-in to the Aporia dashboard and edit the different policies.
Specific integration examples:
If you’re using an AI Gateway not listed here, please contact us at support@aporia.com. We’d be happy to add more examples!