Is it possible to fully automate a new gateway’s configuration?
A device boots up with Ignition installed. Given the device’s IP address, can we remotely trigger configuration of the new gateway to do the following, at minimum? Specifically Ignition Edge in our case, but automating the config of any gateway would be ideal.
- Join the gateway network
- Install the EAM agent
- Install modules (I think EAM does this)
- Configure modules (specifically MQTT Transmission)
The goal is to have a single place we go to “build” a new facility; with automation, the field tech just has to enter a few bits of info after booting a new device, and the system brings it online.
I’m not sure I fully understand, but you can automate tasks with bash or batch scripts, depending on the OS. A good way to trigger those scripts is with system.util.execute in a Gateway Startup Script.
Those scripts can interact with Ignition’s gwcmd utility.
Some of those steps may need to be done manually for security reasons, I think.
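As a minimal sketch of that pattern, assuming a default Linux install path for gwcmd (verify the path and flags with `gwcmd --help` on your version), a wrapper script might look like this. It is dry-run only: it builds and prints the commands rather than executing them:

```shell
#!/usr/bin/env bash
# Hypothetical sketch: assemble gwcmd invocations for a scripted setup.
# The GWCMD path and the --restore/--restart flags are assumptions for a
# default Linux install; confirm them with `gwcmd --help` on your version.
set -euo pipefail

GWCMD="${GWCMD:-/usr/local/bin/ignition/gwcmd.sh}"

# Build (do not run) the command that restores a baseline gateway backup.
restore_cmd() {
  printf '%s --restore %s\n' "$GWCMD" "$1"
}

# Build the command that restarts the gateway so the restore takes effect.
restart_cmd() {
  printf '%s --restart\n' "$GWCMD"
}

# Dry run: print what a startup script would execute.
restore_cmd /opt/backups/baseline.gwbk
restart_cmd
```

In a real deployment you would swap the `printf` calls for actual execution (or let your automation tool run them over SSH) once the flags are confirmed against your Ignition version.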
I think you can achieve most of this… For the gateway name, Gateway Network, and EAM configuration, this can be done by pre-loading the `init.properties` file; see Automated Agent and Controller Installation in the docs. For modules, you can drop the `.modl` file into `<ignition install location>/user-lib/modules`. Past that, you will need to accept the module license and certificate; the best path here is probably to restore a baseline gateway backup (taken after you’ve installed/configured the module normally) using the gwcmd utility. My approach would be to package this all up in an Ansible playbook that you could use to automate deployment of all of these customizations through a remote SSH connection (across one or more devices).
Disclaimer: the above assumes a Linux device; it might be a little trickier against a Windows machine, but still possible (you’d probably just need different automation tools to get it done).
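To make that concrete, a playbook along these lines could tie the steps together. This is a sketch, not a tested deployment: the install path, inventory group, and file names are all assumptions, and the gwcmd flags should be checked against your version:

```yaml
# Hypothetical sketch: push a module and a baseline backup to new gateways.
# Paths, inventory group, and file names are assumptions for a Linux install.
- hosts: new_edge_gateways
  become: true
  vars:
    ignition_home: /usr/local/bin/ignition   # assumed install location
  tasks:
    - name: Drop the MQTT Transmission module into user-lib/modules
      ansible.builtin.copy:
        src: files/MQTT-Transmission-signed.modl
        dest: "{{ ignition_home }}/user-lib/modules/"

    - name: Copy the baseline gateway backup
      ansible.builtin.copy:
        src: files/baseline.gwbk
        dest: /tmp/baseline.gwbk

    - name: Restore the baseline backup with gwcmd (check flags on your version)
      ansible.builtin.command: "{{ ignition_home }}/gwcmd.sh --restore /tmp/baseline.gwbk"

    - name: Restart the gateway so the restored config takes effect
      ansible.builtin.command: "{{ ignition_home }}/gwcmd.sh --restart"
```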
@Kevin.Collins thanks for the response. I had a feeling you would know the answer to these questions! I think you covered it well, and Ansible is my plan; I just needed a push down the right road. The new scripting features in the MQTT modules should help me get the module configured with the transmitters.
I’ll dig into the documentation you linked.
Ansible/GitLab is the method we have used to deploy our Ignition systems to manage projects, tags, and tag UDTs. I’m curious whether there is a way to use the init.properties file to include OPC and other device connections as well. It would come in handy: we could keep a documented, source-controlled layout of our plant’s infrastructure that we can re-deploy quickly in case anything majorly borks, and it would also encourage us to document instead of pointing and clicking.
We manage all our edge gateway configs in our Central (frontend) Perspective project. You can create all the OPC devices there and send the config over the Gateway Network, then use message handler scripts at the edge to parse the config and apply it, adding all your devices, tags, et cetera.
Ansible is a great tool for managing the edge devices.
Would love to hear more about the gitlab aspect.
The specific use case we have is that we want to keep our visualization clients, MES runtime, and data/historian on the IT side of our network. Our plan is to place the edge gateways on the OT side and use MQTT to pass the data from our PLCs, via a distributor in a DMZ, over to the IT side. This creates an issue: we don’t want to run the Gateway Network across our firewalls. Instead, I’d like a way to use Ansible to build the gateway config on demand, with the setpoints, devices, etc. stored as code in a Git repo with Ansible playbooks and group/host variable files. It also helps when we’re building new facilities to be able to plan and set up the playbooks before the devices are available.
I have implemented curl payloads to do gateway commissioning on install, but taking that further than a minimum viable product, to do an entire gateway configuration, is way more work than it is worth, given how often the commissioning interface changed during 8.0 and that our 40+ servers are not always on the same version due to varying project statuses.
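For anyone exploring the curl route, one way to keep that version drift manageable is to assemble the calls as a reviewable dry run before sending anything. Everything below is a placeholder, not a documented API: the endpoint path and JSON fields are invented for illustration and would need to be confirmed against the specific gateway version being commissioned.

```shell
#!/usr/bin/env bash
# Hypothetical sketch: build (not send) commissioning curl calls so they can
# be reviewed per gateway version. The ENDPOINT path and JSON fields below
# are placeholders, NOT a documented API; the real commissioning routes
# changed repeatedly across 8.0.x.
set -euo pipefail

GATEWAY="${GATEWAY:-http://10.0.0.50:8088}"
ENDPOINT="/commissioning-step"   # placeholder path only

# Assemble one commissioning step as a printable curl command.
commission_step() {
  printf "curl -s -X POST -H 'Content-Type: application/json' -d '%s' %s%s\n" \
    "$1" "$GATEWAY" "$ENDPOINT"
}

# Dry run: print the calls instead of hitting the gateway.
commission_step '{"step":"eula","accept":true}'
commission_step '{"step":"authSetup","username":"admin"}'
```

Keeping the payloads in per-version files would let the same script target mixed-version fleets like the 40+ servers described above.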