Learn how to build a basic application using Langchain, Google Gemini, and Aporia Guardrails.
First, install the `langchain` and `langchain-google-genai` pip packages. Then use the `ChatGoogleGenerativeAI` LLM from Langchain to initialize your model.
In this example, you will use `gemini-pro`. To learn more about the text model, read Google AI's language documentation.
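For example, a minimal setup might look like the following sketch (it assumes the `GOOGLE_API_KEY` environment variable holds your Google AI API key):

```python
# pip install langchain langchain-google-genai
from langchain_google_genai import ChatGoogleGenerativeAI

# Initialize the Gemini model through Langchain.
# Assumes the GOOGLE_API_KEY environment variable is set.
llm = ChatGoogleGenerativeAI(model="gemini-pro")

# Send a quick prompt to verify the setup.
response = llm.invoke("Write a one-line greeting for a new user.")
print(response.content)
```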
You can configure model parameters such as `temperature` or `top_p` by passing the appropriate values when creating the `ChatGoogleGenerativeAI` LLM. To learn more about the parameters and their uses, read Google AI's concepts guide.
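For instance, a configuration with explicit sampling parameters might look like this sketch (the specific values are illustrative, not recommendations):

```python
from langchain_google_genai import ChatGoogleGenerativeAI

# Pass generation parameters when constructing the LLM.
# The values below are illustrative; tune them for your use case.
llm = ChatGoogleGenerativeAI(
    model="gemini-pro",
    temperature=0.7,  # higher values produce more varied output
    top_p=0.9,        # nucleus sampling threshold
)
```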
Finally, add Aporia Guardrails by swapping your LLM for the Aporia <> Langchain adapter (available via the `langchain-aporia` pip package):
Aporia <> Langchain adapter code
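The adapter code from the original snippet is not reproduced here. As a rough sketch of how such a wrapper is typically applied, the example below assumes a hypothetical `SecureChatGoogleGenerativeAI` class along with `aporia_url` and `aporia_token` arguments and the corresponding environment variables; the actual names and signature are defined by the `langchain-aporia` package, so consult its documentation for the real interface:

```python
import os

# Hypothetical import: the real class name and signature come from the
# langchain-aporia package; check its documentation before using this.
from langchain_aporia import SecureChatGoogleGenerativeAI  # assumed name

# Wrap the Gemini model with Aporia Guardrails so prompts and responses
# are checked against the policies configured in your Aporia project.
llm = SecureChatGoogleGenerativeAI(
    model="gemini-pro",
    aporia_url=os.environ["APORIA_GUARDRAILS_URL"],     # assumed env var
    aporia_token=os.environ["APORIA_GUARDRAILS_TOKEN"],  # assumed env var
)

response = llm.invoke("What is your refund policy?")
print(response.content)
```

If the adapter is a drop-in replacement for `ChatGoogleGenerativeAI`, the rest of your Langchain code should not need to change.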