Inquiry about new "Chat" menu

The “chat” is essentially OpenFlow’s counterpart to a GPT in ChatGPT.
It allows you to chat with an LLM (OpenAI, Google Gemini, or an open source model; it “works” with open source models, but you get roughly ten times better results with OpenAI, and it works quite well with GPT-3.5, so it is pretty cheap), and it then offers access to the same functions the GPTs can get, as [explained here](Using OpenAI gpts with openflow).

The idea is that people who cannot use cloud services can get the same functionality as people using ChatGPT GPTs, and by hosting the chat in OpenFlow, I can show more verbose information about what each function call does.
The chat itself is pretty basic and is driven by a message queue; all the API talk with the LLMs is handled by a standard package running in an agent.
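To make that flow a bit more concrete, here is a minimal sketch of what such an agent could look like, assuming RabbitMQ as the message queue (via amqplib) and the official openai npm package. The queue name, message shape, and reply handling are hypothetical illustrations, not the actual OpenFlow implementation.

```typescript
// Hypothetical sketch: an agent consumes chat messages from a queue,
// forwards them to an LLM API, and replies on the queue named in replyTo.
// Queue name and payload shape are illustrative, not OpenFlow's real protocol.
import amqp from "amqplib";
import OpenAI from "openai";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

async function main() {
  const conn = await amqp.connect(process.env.AMQP_URL ?? "amqp://localhost");
  const ch = await conn.createChannel();
  const queue = "llm.chat"; // hypothetical queue name
  await ch.assertQueue(queue, { durable: false });

  ch.consume(queue, async (msg) => {
    if (!msg) return;
    try {
      // Expecting { messages: [{ role, content }, ...] } from the chat UI.
      const payload = JSON.parse(msg.content.toString());
      const completion = await openai.chat.completions.create({
        model: "gpt-3.5-turbo",
        messages: payload.messages,
      });
      const reply = completion.choices[0].message.content ?? "";
      // Send the answer back to the caller, RPC-style, via replyTo/correlationId.
      if (msg.properties.replyTo) {
        ch.sendToQueue(
          msg.properties.replyTo,
          Buffer.from(JSON.stringify({ reply })),
          { correlationId: msg.properties.correlationId }
        );
      }
    } finally {
      ch.ack(msg);
    }
  });
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

The point of the queue in the middle is that the web UI never talks to the LLM provider directly; it only publishes messages, and whichever agent holds the API keys picks them up and answers.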

Anyone can enable it, but I have not shared the code for the agent, so it is useless without the agent. My original idea was to see how popular this was, and then decide whether to open source it, add it as a paid service, or just bundle it with a premium license.
But I only have 2 customers using it, and I don’t see them paying for it any time soon. And the people on app.openiap.io are not really using it; most people just click one of the conversation starters. Once in a while someone asks non-OpenFlow questions, like help with code, so I’m not sure this is something people really want. If you want to test it in your own local OpenFlow, hit me up in a private message or on LinkedIn. But keep in mind, I might remove the chat interface from OpenFlow again if there is no use for it (so right now it’s still at an alpha/beta stage).