Is it possible to host Ignition on an Azure virtual machine?

Has anyone done this? How do you configure the DNS and firewall settings
to allow access to web port 8088?
Would appreciate any help. Thanks

Any news on this?

To save anyone else the trouble of finding this post and then having to figure it out themselves: yes, it’s possible to host Ignition on an Azure virtual machine. To access your gateway externally, you need to allow port 8088 both in the networking settings in Azure and in the firewall settings in Windows (assuming you’re using a Windows VM).

To allow the port on Windows Defender Firewall on the VM itself:

  1. Open Windows Defender Firewall and go to “Advanced settings”
  2. Click on “Inbound Rules” on the left menu then “New Rule…” on the right menu (Actions)
  3. Create a “Port” rule that allows TCP connections on 8088
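If you prefer the command line, the same rule can be created in one step from an elevated prompt on the VM (a sketch — the rule name is just a label I picked):

```shell
# Allow inbound TCP connections on port 8088 through Windows Defender Firewall.
# "Ignition Gateway" is an arbitrary rule name -- change it to whatever you like.
netsh advfirewall firewall add rule name="Ignition Gateway" dir=in action=allow protocol=TCP localport=8088
```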

To allow the port in Azure:

  1. Access your VM resource in the web interface for Azure
  2. Go to the Networking tab under Settings
  3. Go to “Inbound port rules” and click “Add inbound port rule”
  4. Set “Destination port ranges” to be 8088
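The same inbound rule can also be added with the Azure CLI. This is a sketch: `MyResourceGroup` and `MyVmNSG` are placeholder names, so substitute your VM’s resource group and network security group.

```shell
# Open TCP 8088 in the VM's network security group.
# The priority just needs to be unique within the NSG; lower numbers win.
az network nsg rule create \
  --resource-group MyResourceGroup \
  --nsg-name MyVmNSG \
  --name Allow-Ignition-8088 \
  --priority 1000 \
  --direction Inbound \
  --access Allow \
  --protocol Tcp \
  --destination-port-ranges 8088
```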

Your public IP address is shown on the Networking tab, and once you’ve set those rules you should be able to access the gateway externally at <public ip address>:8088.
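To sanity-check connectivity from outside, Ignition gateways expose a lightweight status endpoint at `/StatusPing`, so a quick curl is enough. `203.0.113.10` below is a placeholder standing in for your VM’s public IP:

```shell
# Should return a small JSON payload (e.g. a RUNNING state) if the gateway is reachable.
curl -s http://203.0.113.10:8088/StatusPing
```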

Note: This is my first time dealing with cloud computing, so apologies if anything is unclear/imperfect. This is what got it working for me though. Good luck!


Hi @catherine.fowler, how is the performance? And could you please share your VM version/instance details if possible?

Thanks

Hi @srb, I don’t have it fully operational yet (the project is still in development), but based on simulated data and a trial running in the cloud, the performance seems perfectly reasonable. I have found that reading from remote tag providers can be extremely slow, so if you drive a lot of your display dynamically from tags, I’d recommend either making a copy on the cloud version or otherwise migrating the tag provider to be hosted locally on the cloud gateway. For example, I had overview screens that read the documentation, engineering units, and value for a large number of tags in order to build the display, but reading all of that from the remote tag provider made the overview incredibly slow to render. I needed to make a copy of the tag structure, with tags that store the documentation and engineering units on the cloud gateway, to make the overview screens load in a reasonable amount of time.
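To illustrate why per-tag reads hurt so much over a remote/cloud link: each read is its own network round trip, so the latency multiplies by the number of tags. The toy sketch below (plain Python, not the Ignition API) just counts round trips; in Ignition scripting the equivalent fix is a single batched `system.tag.readBlocking(paths)` call instead of looping a read per tag.

```python
class RemoteTagProvider:
    """Toy stand-in for a remote tag provider that counts network round trips."""

    def __init__(self, tags):
        self.tags = tags
        self.round_trips = 0

    def read_one(self, path):
        self.round_trips += 1  # one round trip per tag
        return self.tags[path]

    def read_many(self, paths):
        self.round_trips += 1  # one round trip for the whole batch
        return [self.tags[p] for p in paths]


provider = RemoteTagProvider({"Pump%d/Value" % i: i for i in range(200)})
paths = list(provider.tags)

# Per-tag reads: 200 round trips -- this is what makes overview screens crawl.
slow = [provider.read_one(p) for p in paths]
per_tag_trips = provider.round_trips

provider.round_trips = 0
# Batched read: a single round trip fetches all 200 tags.
fast = provider.read_many(paths)
batched_trips = provider.round_trips

print(per_tag_trips, batched_trips)  # -> 200 1
```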

As for the Azure VM, again, everything is still in development, so I’m on a free trial and using just a standard D2s v3 (2 vCPUs, 8 GiB memory). The specs you’ll need for your VM will depend on how big your application is and how much you need it to handle, so you’ll need to adjust your resources accordingly.
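If you start small and outgrow the instance, resizing is a one-liner with the Azure CLI (a sketch — `MyResourceGroup` and `MyIgnitionVM` are placeholders, and note the VM restarts as part of the resize):

```shell
# Move the VM up a size class, e.g. from Standard_D2s_v3 to Standard_D4s_v3.
az vm resize --resource-group MyResourceGroup --name MyIgnitionVM --size Standard_D4s_v3
```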

Hope this helps!

Thanks for the details @catherine.fowler. I also ran into the issue where reading tag data from a remote tag provider was taking 2 seconds per tag. I had to make some changes to my screens and scripts to improve performance, and so far with live data, performance has been nominal.

You should keep in mind that PLC protocols are both “chatty” and sensitive to latency, which means you really don’t want a cloud VM of any kind to be your OPC server. Ignition Edge, or a stripped-down Ignition with just the drivers, should run in each plant — or use MQTT for all data collection.