How to Send Data to Multiple Bots on Different Computers Using Node-RED?

Hi guys,

I have some code that retrieves a row of data through an API and then splits it into two lists, dividing the data into two parts. I would like to send these two lists to two bots running on different computers. How can I achieve this using Node-RED as the orchestrator?

Thanks.

That is an excellent question.

  1. The quick and dirty (and not recommended) way would be to split up the data and send each part to a separate RPA node in Node-RED, where each node executes the same workflow with different data on a different machine.
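The splitting step for option 1 could look like this in a Node-RED function node. This is a hypothetical sketch, wrapped in a named function so it can run outside Node-RED; it assumes the API row arrives as an array in `msg.payload` and that the function node is configured with two outputs, each wired to its own RPA node:

```javascript
// Sketch only: split msg.payload into two halves, one per output.
// In a real function node the body would end with the return statement,
// and Node-RED would route each inner array to its own wired RPA node.
function splitForTwoBots(msg) {
  const rows = msg.payload;
  const mid = Math.ceil(rows.length / 2);
  // Output 1 gets the first half, output 2 gets the rest.
  return [
    [{ payload: rows.slice(0, mid) }],
    [{ payload: rows.slice(mid) }],
  ];
}
```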

  2. OR, you could create a role in OpenFlow, check “RPA” on the role, and add the two robot users to it. Then log out of both robots and log back in to force their role memberships to update. Both robots will now listen for tasks sent to the role. Inside Node-RED, you can then point an RPA node at the role instead of a user. OpenFlow will load balance requests between the robots, distributing the work evenly: you simply send the two parts of the data to the same RPA node, and if one robot is busy, the next one is tried. If you later need more robots to help with the work, just add more robot users to the role. This used to be the recommended way in the old days, but
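For option 2, the function node changes only slightly: both halves go out the same output, wired to a single RPA node that targets the role, and OpenFlow load balances the resulting invocations across the robots in the role. Again a hypothetical sketch, wrapped in a function for testing, assuming the row arrives as an array in `msg.payload`:

```javascript
// Sketch only: returning two messages inside ONE output array makes
// Node-RED send them one after the other down the same wire, so a
// single role-targeting RPA node receives both parts.
function splitForRole(msg) {
  const rows = msg.payload;
  const mid = Math.ceil(rows.length / 2);
  return [[
    { payload: rows.slice(0, mid) },
    { payload: rows.slice(mid) },
  ]];
}
```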

  3. Nowadays, a much better and more robust way is to use work item queues. Look at the documentation and/or watch a video about what work items are and how they work. Then use the example, or create your own implementation, of a REFramework workflow that wraps your code: it pops a work item, calls a processing workflow, and then updates the state of the work item. Create a role with “RPA” checked and add all the robot users to it, then create a work item queue. On the work item queue, select the REFramework workflow and the role you just created. This tells OpenFlow to ask one of the robots to run the workflow whenever there are new items in the queue that need processing. Work items make it much easier to see progress and to retry failed items, even if you need to edit some of the data before retrying. They ensure transactional security around the data and state, and they let you easily move away from robots later if you decide to use pure code or Node-RED workflows instead.
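The pop → process → update-state loop that the REFramework wrapper performs can be sketched like this. The queue here is just an in-memory stand-in so the sketch is self-contained; in OpenFlow you would use the real work item queue instead, and all names below are hypothetical:

```javascript
// Sketch of the REFramework-style transaction loop around a work
// item queue (in-memory stand-in, NOT the OpenFlow API).
function drainQueue(queue, process) {
  const results = [];
  let item;
  while ((item = queue.shift()) !== undefined) { // "pop" the next work item
    try {
      process(item.payload);        // call the processing workflow
      item.state = "successful";    // update the work item state
    } catch (err) {
      item.state = "retry";         // failed items stay retryable; the data
      item.errormessage = String(err); // can be edited before the next attempt
    }
    results.push(item);
  }
  return results;
}
```

The key design point is that state lives on the work item, not in the robot: a crash mid-run leaves the item in a retryable state rather than losing the transaction.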
