OpenRPA ships with an embedded Python 3.7.3, and some libraries I need (like google-generativeai) are not fully compatible with it. To work around this, I created a complete standalone Python script (a wrapped resume parser) that:
Takes a resume file path as input,
Handles PDF conversion and image extraction externally,
Calls the Gemini API to parse multi-page resumes,
Outputs the parsed result as a JSON object.
I tested this code successfully using VS Code with Python 3.10.11.
I also tried pointing OpenRPA's script settings at a higher Python version's path, but that didn't work.
Now, I want to call this external Python script from OpenRPA so it runs independently and returns the JSON output directly to OpenRPA, which I can then use in my workflow.
My question:
What’s the best way to call and capture output from this external Python script in OpenRPA?
You can always spawn a normal Python executable using Start Process and pass it the path to the script. A common way to then exchange data with the script is through a file on disk, but this is not ideal.
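One way to avoid the temporary file is to have the script print its result as a single JSON object on stdout, which the calling process can capture directly. A minimal sketch of such a wrapper is below; the actual Gemini-based parsing is stubbed out, and all field names are illustrative, not part of any OpenRPA API:

```python
import json
import sys

def parse_resume(path):
    """Placeholder for the real Gemini-based parser.
    Must return a JSON-serializable dict."""
    return {"source": path, "name": None, "skills": []}

def main(argv):
    if len(argv) < 2:
        # Errors also go out as JSON so the caller has one format to handle.
        print(json.dumps({"error": "usage: parser.py <resume-path>"}))
        return 1
    # Print exactly one JSON object on stdout so the caller can parse it.
    print(json.dumps(parse_resume(argv[1])))
    return 0

if __name__ == "__main__":
    sys.exit(main(sys.argv))
```

The workflow then only needs to read the process's stdout and deserialize the JSON, rather than polling for a file to appear.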
The best way to run Python (or any scripting/coding language) is by using agents.
You can run an agent inside OpenCore or remotely, and schedule the code to run on the agent using a few different methods.
If the code is designed to run all the time, like a web server, or some code waiting on something, like items in a work item queue, you can run the code as a Daemon.
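The daemon pattern described above is essentially a long-running poll-and-process loop. The sketch below is a generic skeleton of that shape; `fetch_item`, `handle`, and the other names are illustrative stand-ins, not an OpenCore or OpenRPA API:

```python
import time

def process_item(item):
    # Placeholder for real work, e.g. parsing one resume.
    return {"item": item, "status": "done"}

def run_daemon(fetch_item, handle=process_item, poll_seconds=5, max_iterations=None):
    """Generic long-running worker: poll a source for work items and process them.

    fetch_item: callable returning the next work item, or None when the queue
    is empty. max_iterations exists only so the loop can be bounded in tests;
    a real daemon would run forever (max_iterations=None).
    """
    done = 0
    while max_iterations is None or done < max_iterations:
        item = fetch_item()
        if item is None:
            time.sleep(poll_seconds)  # nothing queued; wait before polling again
        else:
            handle(item)
        done += 1
```

A real agent would plug the work item queue client into `fetch_item` and its processing code into `handle`.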
If the code is designed to be run at intervals without arguments, you can schedule it to run using a CRON schedule in the OpenCore web interface.
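For reference, assuming the schedule field accepts standard five-field cron syntax (minute, hour, day-of-month, month, day-of-week), expressions look like this:

```
0 */6 * * *    -> at minute 0 of every 6th hour (00:00, 06:00, 12:00, 18:00)
*/15 * * * *   -> every 15 minutes
0 9 * * 1-5    -> at 09:00 on weekdays
```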
In a few weeks, when we launch serverless, you will be able to trigger it using a REST call or a message queue message.
I sometimes publish OpenRPA builds with multiple embedded Python versions. For instance, the latest version includes both 3.7 and 3.11; if your script also works with 3.11, you could try that. If not, then yes, if you have the environment for it, you can build OpenRPA from source and change the NuGet package for the embedded Python to the version you want.