Everyone else has AI, so why not OpenCore?

FaaS + LLM Generating Code Proof of Concept
FaaS isn’t production-ready yet, but that doesn’t mean we can’t build a proof-of-concept solution.
Here’s my first attempt at creating a FaaS interface—allowing OpenAI to create, deploy, and run functions for easy testing.

Video Description
Using OpenCore, as-is, as a Function-as-a-Service platform. I built a simple chat interface that can ask an LLM to generate code, deploy it as a package to OpenCore, then run the code on an agent. Finally, it calls the function’s HTTP endpoint to validate that it works.
This way, the LLM can catch deployment failures, syntax errors, and runtime errors—and fix them automatically (as shown in the last example with the Chuck Norris API).
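The generate→deploy→run→validate cycle described above can be sketched as a simple retry loop. This is a minimal sketch under my own assumptions: `generate`, `deploy`, and `invoke` are hypothetical stand-ins for the LLM call, the OpenCore package deployment, and the HTTP endpoint call, not the actual APIs used in the demo:

```python
# Minimal sketch of a self-correcting deploy loop (hypothetical callables,
# not OpenCore's real API): any failure is fed back to the LLM as context.

def build_and_validate(generate, deploy, invoke, task, max_attempts=3):
    """Ask the LLM for code, deploy it, call its HTTP endpoint,
    and feed any error back to the LLM until it works."""
    feedback = None
    for attempt in range(1, max_attempts + 1):
        code = generate(task, feedback)  # LLM writes (or fixes) the function
        try:
            url = deploy(code)           # package + deploy to an agent
            return invoke(url)           # call the HTTP endpoint; success
        except Exception as err:         # syntax, deploy, or runtime failure
            feedback = str(err)          # the LLM sees this error next round
    raise RuntimeError(f"still failing after {max_attempts} attempts: {feedback}")
```

The point is that deployment failures, syntax errors, and runtime errors all surface the same way (an exception from `deploy` or `invoke`), so one loop handles all three.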


Can you tell us more about FaaS? Is it an AI agent limited by system prompts to invoke functions/agents from inside OpenCore? Or is it simply making individual API calls to OpenAI?

If it’s an AI agent, maybe the system prompt needs more tweaking to make sure the functions are being chained properly.

Also, I saw in another chat that this FaaS thing is somehow a prerequisite for upgrading OpenRPA to .NET Core. Is that so?

In any case, looks interesting Allan.

FaaS means Function as a Service … also called “serverless functions.”
It is about abstracting away all the things needed to make code secure and scalable.
OpenCore already supports something like this, but for much more complex scenarios, and it lacks the ability to “scale to zero,” i.e. to free a function’s resources while it is not in use.

One of the cool side effects of making code as simple as “fill out the body of this function with the logic I need” is that it makes it much easier for an LLM to generate the code for you. This is what I’m demonstrating here. I made a simple web page that uses a custom prompt and 3 AI tools, allowing the LLM to generate code, publish it, and then test it all on its own. This way, the LLM can fix its own errors without any user input.
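One way to wire up that "custom prompt plus 3 AI tools" pattern is the standard function-calling loop: describe the tools to the model, execute whichever tool it requests, and append each result to the conversation so the model can react to errors. This is a sketch under my own assumptions; the tool names (`save_code`, `deploy_package`, `call_endpoint`) are illustrative guesses, not the actual tools in the demo:

```python
import json

# Hypothetical tool schemas in OpenAI function-calling format; the three
# tool names are illustrative guesses, not the demo's actual tools.
TOOLS = [
    {"type": "function", "function": {
        "name": "save_code",
        "description": "Save the generated function body as a package",
        "parameters": {"type": "object",
                       "properties": {"code": {"type": "string"}},
                       "required": ["code"]}}},
    {"type": "function", "function": {
        "name": "deploy_package",
        "description": "Deploy the saved package to an agent",
        "parameters": {"type": "object", "properties": {}}}},
    {"type": "function", "function": {
        "name": "call_endpoint",
        "description": "Invoke the deployed function's HTTP endpoint "
                       "and return the response or the error",
        "parameters": {"type": "object", "properties": {}}}},
]

def run_tool_loop(chat, handlers, messages, max_turns=10):
    """Drive the model until it stops requesting tools.

    `chat` is the LLM call, returning (reply_text, tool_calls) where
    tool_calls is a list of (name, args); `handlers` maps tool names to
    local callables. Tool results go back into `messages`, so the model
    can see failures and fix its own code without user input.
    """
    for _ in range(max_turns):
        reply, tool_calls = chat(messages, TOOLS)
        if not tool_calls:
            return reply                     # model is done (or gave up)
        for name, args in tool_calls:        # execute each requested tool
            result = handlers[name](**args)
            messages.append({"role": "tool", "name": name,
                             "content": json.dumps(result)})
    raise RuntimeError("tool loop did not converge")
```

Because every tool result (including error messages) is appended to the conversation, the model gets the same feedback a human would see in the console, which is what lets it repair its own mistakes.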


Replaced the images with a video instead.