When Chainlit starts, initialize a new Embedchain app with GPT-3.5 and streaming enabled. This is where you can add documents to be used as knowledge for your RAG chatbot. For more information, see the Embedchain docs.
```python
import uuid

import chainlit as cl
from embedchain import App


@cl.on_chat_start
async def chat_startup():
    app = App.from_config(
        config={
            "app": {
                "config": {
                    "name": "my-chatbot",
                    "id": str(uuid.uuid4()),
                    "collect_metrics": False,
                }
            },
            "llm": {
                "config": {
                    "model": "gpt-3.5-turbo-0125",
                    "stream": True,
                    "temperature": 0.0,
                }
            },
        }
    )
    # Add documents to be used as the knowledge base for the chatbot
    app.add("my_knowledge.pdf", data_type="pdf_file")
    cl.user_session.set("app", app)
```
When a user writes a message in the chat UI, call the Embedchain RAG app:
```python
@cl.on_message
async def on_new_message(message: cl.Message):
    app = cl.user_session.get("app")
    msg = cl.Message(content="")
    # Stream the RAG response token by token into the chat UI
    for chunk in await cl.make_async(app.chat)(message.content):
        await msg.stream_token(chunk)
    await msg.send()
```
Next, to integrate Aporia Guardrails, get your Aporia API key and base URL as described in the OpenAI proxy documentation. You can then pass them to the Embedchain app through its configuration:
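As a minimal sketch, the Aporia proxy details could be wired into the llm section of the Embedchain config. Note the assumptions here: the `base_url` and `api_key` keys depend on your Embedchain version's OpenAI configuration support, and the URL and key values are placeholders, not real Aporia endpoints.

```python
# Placeholder values -- substitute the base URL and API key you obtained
# from Aporia's OpenAI proxy documentation.
APORIA_BASE_URL = "https://your-aporia-proxy.example.com/v1"  # hypothetical URL
APORIA_API_KEY = "your-aporia-api-key"  # hypothetical key

config = {
    "llm": {
        "config": {
            "model": "gpt-3.5-turbo-0125",
            "stream": True,
            "temperature": 0.0,
            # Route the OpenAI calls through the Aporia Guardrails proxy
            # (assumes this Embedchain version accepts `base_url` / `api_key`
            # in the llm config).
            "base_url": APORIA_BASE_URL,
            "api_key": APORIA_API_KEY,
        }
    }
}
```

Passing this dict to `App.from_config` (as in the startup handler above) would route all LLM traffic through the guardrails proxy instead of directly to OpenAI.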