I recently noticed a new feature on the https://app.openiap.io platform titled “Chat” and I’m curious to learn more about it. Specifically, I’m wondering if this functionality is connected to ChatGPT, the AI language model developed by OpenAI. If so, how exactly does it work?
Also, I’m interested to know: Would it be possible to access and implement the “Chat” functionality within my own OpenFlow setup?
Any insights or guidance on how to achieve this would be greatly appreciated.
The “Chat” is essentially OpenFlow’s counterpart to a GPT in ChatGPT.
It allows you to chat with an LLM (OpenAI, Google Gemini, or an open-source model; it “works” with open-source models, but you get roughly ten times better results with OpenAI, and it works quite well with GPT-3.5, so it’s pretty cheap), and it then offers access to the same functions the GPTs can use, as explained in the “Using OpenAI GPTs with OpenFlow” topic.
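For context, exposing workflows as “functions” a model can call works along the lines of OpenAI-style function calling (tools). Here is a minimal sketch; the `run_workflow` tool name and its parameter schema are invented for illustration and are not the actual tool definitions OpenFlow registers:

```python
# Hypothetical sketch: describing an (invented) OpenFlow workflow as an
# OpenAI-style tool, so the model can decide to call it during a chat.

def build_tools():
    """Return a tools list in the shape OpenAI's chat API expects."""
    return [{
        "type": "function",
        "function": {
            "name": "run_workflow",  # invented name, for illustration only
            "description": "Invoke an OpenFlow workflow by name.",
            "parameters": {
                "type": "object",
                "properties": {
                    "workflow": {"type": "string", "description": "Workflow name"},
                    "payload": {"type": "object", "description": "Input data"},
                },
                "required": ["workflow"],
            },
        },
    }]

# The actual call would then look roughly like this (requires an API key):
# from openai import OpenAI
# client = OpenAI()
# resp = client.chat.completions.create(
#     model="gpt-3.5-turbo",
#     messages=[{"role": "user", "content": "Run the invoice workflow"}],
#     tools=build_tools(),
# )
```

The model replies with a tool call (name plus JSON arguments), the host executes the workflow, and the result is fed back as a tool message.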
The idea is that people who cannot use cloud services can get the same functionality as people using ChatGPT’s GPTs, and by hosting the chat in OpenFlow, I can show more verbose information about what each function call does.
The chat itself is pretty basic: it is driven by a message queue, and all the API communication with the LLMs is handled by a standard package running in an agent.
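As a rough illustration of that split (queue-driven chat UI, LLM calls handled by an agent), here is a minimal sketch using in-process queues. The message shape and the `call_llm` stub are invented for illustration; they are not OpenFlow’s actual queue protocol:

```python
import json
import queue

def call_llm(prompt: str) -> str:
    """Stand-in for the real LLM client in the agent; in practice this
    would call OpenAI, Gemini, or a locally hosted open-source model."""
    return f"echo: {prompt}"

def agent_loop(inbox: "queue.Queue", outbox: "queue.Queue") -> None:
    """Consume chat messages from a queue, ask the LLM, publish replies."""
    while True:
        raw = inbox.get()
        if raw is None:  # shutdown sentinel
            break
        msg = json.loads(raw)
        reply = call_llm(msg["text"])
        outbox.put(json.dumps({"conversation": msg["conversation"], "text": reply}))

# The chat front end only ever touches the queues, never the LLM directly.
inbox: "queue.Queue" = queue.Queue()
outbox: "queue.Queue" = queue.Queue()
inbox.put(json.dumps({"conversation": "c1", "text": "hello"}))
inbox.put(None)
agent_loop(inbox, outbox)
print(json.loads(outbox.get())["text"])  # → echo: hello
```

In a real deployment the two queues would live on the message broker, so the chat UI and the agent can run on different machines, and the agent can be swapped out without touching the front end.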
Anyone can enable it, but I have not shared the code for the agent, so it’s useless without it. My original idea was to see how popular it became and then decide whether to open source it, add it as a paid service, or just bundle it with a premium license.
But I only have 2 customers using it, and I don’t see them paying for it any time soon. The people on app.openiap.io are not really using it either; most just click one of the conversation starters, and once in a while someone asks non-OpenFlow questions, like help with code, so I’m not sure this is something people really want. If you want to test it in your own local OpenFlow, hit me up in a private message or on LinkedIn. But keep in mind, I might remove the chat interface from OpenFlow again if there is no use for it (so right now it’s still at the alpha/beta stage).
It’s great to hear how it integrates with different LLMs like OpenAI and Google Gemini.
Although it’s not widely used yet and is still in the alpha/beta stage, I see potential in it, especially for understanding interactions with LLMs in OpenFlow.
I’m interested in testing this feature in my local OpenFlow setup, and I will definitely reach out for more details.