OpenAI Integration
The following example shows how to integrate OpenAI into React ChatBotify. It leverages the LLM Connector Plugin, which is maintained separately under the React ChatBotify Plugins organization, and uses the OpenaiProvider that ships by default with the plugin. If you need help with the plugin, please reach out on the plugins discord instead.
The plugin also comes with other default providers, which you can try out in the LLM Conversation Example and Gemini Integration Example.
If you expect your LLM responses to contain markdown, consider using the Markdown Renderer Plugin as well!
This example uses 'direct' mode for demonstration purposes, which exposes API keys client-side. In production, you should proxy your requests and store your API keys server-side. A lightweight demo project for an LLM proxy can be found here, and you may also refer to this article for more details.
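To illustrate the server-side proxy recommended above, here is a minimal sketch of what one could look like. The endpoint path, helper names, and `OPENAI_API_KEY` environment variable are illustrative assumptions rather than part of the plugin, and it assumes a runtime with the built-in Fetch API (Node 18+):

```typescript
// Minimal sketch of a server-side proxy for OpenAI chat requests.
// The API key lives in a server-side environment variable and never
// reaches the browser. Names here are illustrative assumptions.

const OPENAI_URL = "https://api.openai.com/v1/chat/completions";

// Builds the upstream request options, attaching the server-side key.
export function buildUpstreamInit(body: string, apiKey: string) {
  return {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body,
  };
}

// A framework-agnostic handler: receive the client's payload, forward it
// to OpenAI with the key attached, and relay the response back unchanged.
export async function proxyChatCompletion(clientBody: string) {
  const upstream = await fetch(
    OPENAI_URL,
    buildUpstreamInit(clientBody, process.env.OPENAI_API_KEY ?? ""),
  );
  return { status: upstream.status, body: await upstream.text() };
}
```

With a setup like this, the chatbot sends its chat payload to your proxy endpoint instead of api.openai.com, so the key never leaves the server.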
```jsx
const MyChatBot = () => {
  // openai api key, required since we're using 'direct' mode for testing
  let apiKey = "";

  // initialize the plugin
  const plugins = [LlmConnector()];

  // example flow for testing
  const flow: Flow = {
    start: {
      message: "Hello! Make sure you've set your API key before getting started!",
      options: ["I am ready!"],
      chatDisabled: true,
      path: async (params) => {
        if (!apiKey) {
          await params.simulateStreamMessage("You have not set your API key!");
          return "start";
        }
        await params.simulateStreamMessage("Ask away!");
        return "openai";
      },
    },
    openai: {
      llmConnector: {
        // provider configuration guide:
        // https://github.com/React-ChatBotify-Plugins/llm-connnector/blob/main/docs/providers/OpenAI.md
        provider: new OpenaiProvider({
          mode: 'direct',
          model: 'gpt-4.1-nano',
          responseFormat: 'stream',
          apiKey: apiKey,
        }),
        outputType: 'character',
      },
    },
  };

  return (
    <ChatBot
      settings={{general: {embedded: true}, chatHistory: {storageKey: "example_openai_integration"}}}
      plugins={plugins}
      flow={flow}
    ></ChatBot>
  );
};

render(<MyChatBot/>);
```