Aporia Guardrails can be integrated into LLM-based applications using two distinct methods: the OpenAI Proxy and Aporia’s REST API.

Just getting started and using OpenAI or Azure OpenAI? Skip this guide and use the OpenAI proxy integration.

Method 1: OpenAI Proxy

Overview

In this method, Aporia acts as a proxy: it forwards your requests to OpenAI and invokes guardrails along the way. The response you receive is either the original from OpenAI or a version modified to enforce Aporia’s policies.

This is the simplest way to get started if you use OpenAI or Azure OpenAI.

Key Features

  • Ease of Setup: Modify the base URL and add the X-APORIA-API-KEY header. For Azure OpenAI, also add the X-AZURE-OPENAI-ENDPOINT header (see the sketch at the end of this section).
  • Streaming Support: Fully supports streaming, making it ideal for real-time applications such as chatbots.
  • LLM Provider Specific: Can only be used if the LLM provider is OpenAI or Azure OpenAI.

This method is ideal if you want a hassle-free setup with minimal code changes and your LLM provider is OpenAI or Azure OpenAI.
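
Below is a minimal sketch of the proxy setup using the official openai Python SDK (v1+). The proxy base URL is a placeholder; copy the actual URL from your Aporia project settings, and the model name is illustrative.

```python
# Minimal sketch of the OpenAI Proxy integration, assuming the official
# `openai` Python SDK (v1+). The base URL is a placeholder; use the proxy
# URL shown in your Aporia project settings.
from openai import OpenAI

client = OpenAI(
    api_key="<YOUR_OPENAI_API_KEY>",
    base_url="<YOUR_APORIA_PROXY_URL>",  # placeholder: copy from your Aporia project
    default_headers={
        "X-APORIA-API-KEY": "<YOUR_APORIA_API_KEY>",
        # For Azure OpenAI, also point the proxy at your Azure endpoint:
        # "X-AZURE-OPENAI-ENDPOINT": "https://<resource>.openai.azure.com",
    },
)

# Requests flow through Aporia, which invokes guardrails and returns either
# the original completion or a policy-enforced revision.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

Streaming works unchanged through the proxy: pass stream=True to the same call and iterate over the chunks as usual.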

Method 2: Aporia’s REST API

Overview

This approach involves making explicit calls to Aporia’s REST API at two key stages: before sending the prompt to the LLM, to check for prompt-level policy violations (e.g. Prompt Injection), and after receiving the response, to apply response-level guardrails (e.g. RAG Hallucinations).
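
The sketch below illustrates this two-stage flow. The endpoint URL and the request/result fields (messages, response, action, revised_response) are assumptions for illustration only; the actual schema is in Aporia’s API reference. call_llm is a hypothetical stand-in for your own model call.

```python
import requests

# Placeholders: copy the real endpoint and key from your Aporia project settings.
APORIA_URL = "<YOUR_APORIA_REST_API_URL>"
HEADERS = {"X-APORIA-API-KEY": "<YOUR_APORIA_API_KEY>"}

def run_guardrails(messages, llm_response=None):
    # The payload fields ("messages", "response") and result fields
    # ("action", "revised_response") are illustrative assumptions,
    # not Aporia's documented schema; consult the API reference.
    payload = {"messages": messages}
    if llm_response is not None:
        payload["response"] = llm_response
    resp = requests.post(APORIA_URL, json=payload, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()

def call_llm(messages):
    # Stand-in for your own LLM call; any provider works here
    # (OpenAI, AWS Bedrock, Vertex AI, an OSS model, ...).
    raise NotImplementedError

def answer(user_prompt):
    messages = [{"role": "user", "content": user_prompt}]

    # Stage 1: prompt-level policies (e.g. Prompt Injection) before the LLM call.
    verdict = run_guardrails(messages)
    if verdict.get("action") == "block":
        # Custom action: respond however you like instead of using
        # Aporia's revised response.
        return "Sorry, I can't help with that."

    llm_response = call_llm(messages)

    # Stage 2: response-level policies (e.g. RAG Hallucinations) on the answer.
    verdict = run_guardrails(messages, llm_response)
    return verdict.get("revised_response", llm_response)
</code>
```

The returned object is also where the detailed feedback described below lands, so you can log which policies were triggered or branch on them with your own custom actions.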

Key Features

  • Detailed Feedback: Returns logs detailing which policies were triggered and what actions were taken.
  • Custom Actions: Lets you implement your own responses or actions instead of using Aporia’s revised response, giving you flexibility in how policy violations are handled.
  • LLM Provider Flexibility: Any LLM is supported with this method (OpenAI, AWS Bedrock, Vertex AI, OSS models, etc.).

Suited for developers requiring detailed control over policy enforcement and customization, especially when using LLM providers other than OpenAI or Azure OpenAI.

Comparison of Methods

  • Simplicity vs. Customizability: The OpenAI Proxy offers simplicity for OpenAI users, whereas Aporia’s REST API offers flexible, detailed control suitable for any LLM provider.
  • Streaming Capabilities: Supported by the OpenAI Proxy; planned for Aporia’s REST API.

If you’re just getting started, the OpenAI Proxy is recommended due to its straightforward setup. Developers requiring more control and detailed policy management should consider transitioning to Aporia’s REST API later on.