Hello,
We are implementing a hub-and-spoke architecture for a client in which individual Ignition! systems will operate across North America. Each Ignition! system will have a local Postgres DB installed on its server. We are trying to design a solution for replicating the following DB tables in near real time to our hub environment (Azure).
Table 1: Event data, sparsely inserted every ~10 minutes.
Table 2: Tag values, one new row inserted every second (~300 columns).
Once up and running, we want the information stored in each local DB to also be sent to our Hub environment and stored in a 'Hot' Postgres cluster. Table 1 rows from all sites will accumulate into a single table; for Table 2, we think the easiest solution is to replicate each site's table into its own separate table in the Hub.
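For reference, one possible starting point might be Postgres's built-in logical replication, with one subscription on the Hub per site. This is only a sketch; all hostnames and object names below are illustrative, and it assumes `wal_level = logical` on each spoke. One caveat: logical replication applies changes to a subscriber table with the same schema-qualified name as the publisher table, so the fan-in of Table 1 into a single Hub table works naturally, but per-site copies of Table 2 would need distinct table names at the source (or a CDC tool such as Debezium that can route/rename streams).

```sql
-- On each spoke (site) Postgres, assuming wal_level = logical
-- and replica identity is set on both tables:
CREATE PUBLICATION spoke_pub FOR TABLE event_data, tag_values_site_chicago;

-- On the Hub cluster, one subscription per site
-- (connection string and names are illustrative):
CREATE SUBSCRIPTION site_chicago_sub
    CONNECTION 'host=site-chicago.example.com dbname=ignition user=repl'
    PUBLICATION spoke_pub;
```

With this layout, every site's `event_data` rows land in the Hub's single `event_data` table, while each site's wide tag table replicates into its identically named per-site Hub table.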
These Hub tables then need to be replicated into 'Cold' storage, either long-term Postgres storage or the client's Data Lake, ideally within 30 seconds of the Ignition! system in the field reading the sensor data.
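To make the 30-second target concrete, the Hot-to-Cold hop could be a micro-batch job that flushes a window of rows every 30 seconds. Below is a toy sketch of just the windowing logic, independent of any DB driver or lake SDK (all names are illustrative assumptions, not part of our current setup): it floors each row's timestamp to a 30-second window so a flusher can ship each completed window to cold storage.

```python
from datetime import datetime, timedelta

# Width of one flush window; each completed window is what a
# hypothetical flusher would copy from the Hot cluster to Cold storage.
WINDOW = timedelta(seconds=30)
_EPOCH = datetime(1970, 1, 1)

def window_start(ts: datetime) -> datetime:
    """Floor a timestamp to the start of its 30-second window."""
    return _EPOCH + ((ts - _EPOCH) // WINDOW) * WINDOW

def batch_rows(rows):
    """Group (timestamp, payload) rows by their 30-second window."""
    batches = {}
    for ts, payload in rows:
        batches.setdefault(window_start(ts), []).append(payload)
    return batches
```

For example, rows stamped at 00:00:00, 00:00:10, and 00:00:29 fall into one window, while a row at 00:00:31 starts the next; flushing each window as it closes keeps worst-case Hot-to-Cold latency near the window width.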
Any ideas on possible ways to accomplish this would be greatly appreciated.