Should I implement a development environment for OpenRPA and OpenFlow?

Hi everyone. I’m new to the OpenIAP community. Recently our team decided to introduce RPA into our processes. My background is in software development, so my mindset might be stuck in the software development workflow a little bit; please correct me if so.

Our team is thinking about setting up two environments for working with OpenRPA and OpenFlow: a development environment and a production environment.

  • The development environment: consists of all the developers’ machines in the team and is used for developing OpenRPA workflows. The workflows will be submitted to the OpenFlow development environment for testing before being promoted to production.
  • The production environment: consists of only one machine that runs the workflows tested and approved during development, and is connected to one OpenFlow production environment and the other required systems.

Should we do this? Why/why not? If we can, then my biggest question is how to move all the accepted workflows from the development environment to the production environment (automatically, or manually in the worst case).

I appreciate any suggestions and help. Thanks in advance.

That is a highly recommended way to do it, yes.

Moving workflows from dev to prod has been a bit of a pain so far (since it is most easily done using PowerShell), but I’m currently working on a snapshot feature that uses GitHub to snapshot/restore any elements from the database into Git repositories.
Once that is done, you can update production simply by pushing items using Git and clicking a restore button inside OpenFlow.


I tried looking for an example, and Google doesn’t like me today…
Do you think we could wrap it (the PowerShell part) into an OpenRPA workflow? I could take a stab at it and PR it to your examples repo, but for the life of me I can’t find the PowerShell example you’re talking about.

That way a dev could run a PushToProd workflow (maybe with a form popup or something similar to select what exactly to push) without leaving the place they’re at.
Any credentials required for the actual push could then be fetched as standard credentials from OpenFlow, which also gives a relatively nice mechanism for controlling who can push and who can’t.

While this is obviously just a workaround compared to what you’re working on, it could nudge users into separating dev and prod, which is definitely the way to go, given how updates get propagated through.

When you install OpenRPA with the default settings, it adds a PowerShell module called openrpa:

Get-Command -Module openrpa

With this module you can query and manipulate any data in OpenFlow. For instance,

Get-Entity -Collection openrpa -Query '{"_type":"workflow"}'

will get all OpenRPA workflows. The problem is that the result will not include the images embedded inside the workflows, so for OpenRPA workflows specifically there is a dedicated Export-OpenRPAWorkflow cmdlet you can pipe the result into. It will download all images for each workflow and embed them into the workflow, so that when you import them again, OpenRPA can extract the images from the workflow and save them back into GridFS.

Get-Entity -Collection openrpa -Query '{"_type":"workflow"}' | Export-OpenRPAWorkflow  -Force

This will use the filename from the workflow, so it might overwrite workflows that have the same name in different projects. To avoid this, you might want to start by getting each project and then export each project into its own folder, something like:

$projects = Get-Entity -Collection openrpa -Query '{"_type":"project"}'
foreach ($p in $projects) {
  # fetch all workflows belonging to this project
  $workflows = Get-Entity -Collection openrpa -Query "{`"_type`":`"workflow`", `"projectid`":`"$($p._id)`"}"
  # use the project's name property as the folder name
  $folderPath = Join-Path -Path $pwd.Path -ChildPath $p.name
  if (-Not (Test-Path -Path $folderPath)) {
    New-Item -Path $folderPath -ItemType Directory | Out-Null
  }
  $workflows | Export-OpenRPAWorkflow -Force -folder $folderPath
}

Many thanks for the replies. May I ask some more about exporting the workflows?

  1. Supposing a workflow involves some NodeRED triggers, what should we do to copy the NodeRED configuration from the dev to the prod environment along with the OpenRPA workflows?
  2. As you mentioned, we can work around this at the moment with PowerShell. So this means we should do something manually to mark a workflow as “Ready for Production”, and then schedule a PowerShell script to automatically export and transfer the workflow to the prod environment, is that right?

All objects related to a specific NodeRED instance (flows, credentials, sessions, npmrc and so on) will be saved in the nodered collection with “nodered_id” set to that instance’s id.
So you can easily export all the objects using

$nodered = Get-Entity -Collection nodered -Query '{"nodered_id":"my-nodered"}'
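For the manual worst case mentioned earlier, the exported objects can also be serialized to a JSON file with the standard ConvertTo-Json cmdlet, so they can be reviewed, committed to Git, or copied to the production machine by hand. This is just a sketch; the file name is a placeholder, and it assumes the openrpa module is loaded and you are logged in:

```powershell
# Export all objects belonging to the "my-nodered" instance
$nodered = Get-Entity -Collection nodered -Query '{"nodered_id":"my-nodered"}'

# Serialize to a JSON file (a versioned, human-readable backup)
$nodered | ConvertTo-Json -Depth 20 | Set-Content -Path 'my-nodered-backup.json'

# Later (or on another machine), read the objects back:
$restored = Get-Content -Path 'my-nodered-backup.json' -Raw | ConvertFrom-Json
```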

By default PowerShell will log in using the information in settings.json (you can verify that using Get-CurrentUser),
and you can switch to a different OpenFlow installation (or just a different user) using

Set-CurrentUser -Username Allan -Password SuperSecret -WSURL wss://

and then “restore” the nodered objects into the new openflow using

Set-Entity -Objects $nodered -Collection nodered
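Putting those three steps together, a minimal dev-to-prod migration script might look like the sketch below. The username, password, URL and the “my-nodered” id are placeholders you would replace with your own; the cmdlets are the ones from the openrpa module described above:

```powershell
# 1. While still logged in to the DEV OpenFlow (per settings.json),
#    grab all objects belonging to the NodeRED instance
$nodered = Get-Entity -Collection nodered -Query '{"nodered_id":"my-nodered"}'

# 2. Switch the session to the PROD OpenFlow installation
#    (placeholder credentials and URL)
Set-CurrentUser -Username produser -Password SuperSecret -WSURL wss://prod.example.com

# 3. Restore the objects into the production instance
Set-Entity -Objects $nodered -Collection nodered

# Sanity check: confirm which user/installation we are now connected to
Get-CurrentUser
```

The same pattern works for other collections too, since Get-Entity/Set-Entity take the collection name as a parameter.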

As I mentioned above, for people with a premium license, I’m almost done with a solution that makes all of this super easy by combining the Git server with a snapshot/restore system. In that case you simply clone and push a Git repository to move between environments, and you get the benefit of full version control as well.


Many thanks for the thorough answers. I’ll keep you posted on our progress.

This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.