RAG Chatbot: Embedchain + Chainlit
Learn how to build a streaming RAG chatbot with Embedchain and OpenAI, using Chainlit for the chat UI and Aporia Guardrails to protect it.
Setup
Install required libraries:
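For example, with pip (package names assumed from the libraries used in this guide):

```bash
pip install embedchain chainlit
```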
Import libraries:
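A minimal set of imports for the example below; Embedchain reads the OpenAI API key from the environment:

```python
import os

import chainlit as cl
from embedchain import App

# Make sure the OpenAI API key is available, e.g.:
# os.environ["OPENAI_API_KEY"] = "sk-..."
```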
Build a RAG chatbot
When a new chat session starts in Chainlit, initialize an Embedchain app that uses GPT-3.5 with streaming enabled.
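A minimal sketch of the chat-start handler, assuming Embedchain's standard OpenAI provider configuration keys:

```python
@cl.on_chat_start
async def on_chat_start():
    # Create an Embedchain app backed by GPT-3.5 with streaming responses.
    app = App.from_config(config={
        "llm": {
            "provider": "openai",
            "config": {
                "model": "gpt-3.5-turbo",
                "stream": True,
            },
        }
    })

    # Add your knowledge sources here -- see the next step.

    # Keep the app in the user session so the message handler can reuse it.
    cl.user_session.set("app", app)
```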
This is where you can add documents to be used as knowledge for your RAG chatbot. For more information, see the Embedchain docs.
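For example, inside `on_chat_start` before storing the app in the session (the URL and file path below are placeholders for your own sources):

```python
    # Placeholder sources -- replace with your own documents.
    app.add("https://www.example.com/my-docs")
    app.add("path/to/manual.pdf", data_type="pdf_file")
```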
When a user writes a message in the chat UI, call the Embedchain RAG app:
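A sketch of the message handler: the blocking `app.chat` call is wrapped in `cl.make_async` so it runs in a worker thread, and each generated token is streamed back to the UI:

```python
@cl.on_message
async def on_message(message: cl.Message):
    app = cl.user_session.get("app")
    msg = cl.Message(content="")

    # With streaming enabled, app.chat yields response tokens one by one.
    for chunk in await cl.make_async(app.chat)(message.content):
        await msg.stream_token(chunk)

    await msg.send()
```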
To start the application, run:
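Assuming the code above is saved as `app.py`:

```bash
chainlit run app.py
```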
Integrate Aporia Guardrails
Next, to integrate Aporia Guardrails, get your Aporia API key and base URL as described in the OpenAI proxy documentation.
You can then add them to the Embedchain app's LLM configuration:
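A minimal sketch of the change: route the OpenAI traffic through Aporia's proxy by overriding the base URL and attaching the Aporia API key as a header. The placeholder values, the header name, and the `base_url`/`default_headers` config keys should be checked against the Aporia OpenAI proxy documentation and your Embedchain version:

```python
APORIA_BASE_URL = "<your Aporia OpenAI proxy base URL>"  # from the Aporia dashboard
APORIA_API_KEY = "<your Aporia API key>"                 # from the Aporia dashboard

app = App.from_config(config={
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-3.5-turbo",
            "stream": True,
            # Send requests through Aporia's proxy so guardrails are applied.
            "base_url": APORIA_BASE_URL,
            "default_headers": {"X-APORIA-API-KEY": APORIA_API_KEY},
        },
    }
})
```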
AGT Test
You can now test the integration using the AGT (Aporia Guardrails Test) prompt from the Aporia documentation. Send it in the chat UI and confirm that the guardrails are triggered.
Conclusion
That’s it. You have successfully created an LLM application using Embedchain, Chainlit, and Aporia.