Hub & Spoke Realtime Database Replication

Hello,

We are implementing a hub and spoke architecture for a client where individual Ignition systems will be operating across North America. Each Ignition system will have a local Postgres DB installed on its server. We are looking for a way to replicate the following DB tables into our hub environment (Azure) in real time.

Table 1: Event data, inserted sparsely (roughly every 10 minutes).
Table 2: Tag values, inserted as a new row every second (~300 columns).
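For illustration, the two tables might look roughly like this (the column names here are assumptions, not from the actual schema):

```sql
-- Hypothetical shapes of the two tables; names and columns are assumptions.
CREATE TABLE events (
    id          uuid PRIMARY KEY DEFAULT gen_random_uuid(),
    site_id     text NOT NULL,          -- identifies the field system
    occurred_at timestamptz NOT NULL,
    payload     jsonb
);

CREATE TABLE tag_values (
    id          uuid PRIMARY KEY DEFAULT gen_random_uuid(),
    recorded_at timestamptz NOT NULL
    -- ... plus the ~300 tag-value columns mentioned above
);
```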

Once up and running, we want the information stored in each local DB to also be sent to our Hub environment and stored in a 'Hot' Postgres cluster. Table 1 rows from all sites will accumulate into a single table; for Table 2, we think the easiest solution is a separate per-site table in the Hub.

These Hub tables then need to be replicated into 'Cold' storage, either long-term Postgres storage or the client's Data Lake, ideally within 30 seconds of the field Ignition system reading the sensor data.
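One way to meet a ~30-second hot-to-cold window, if the cold tier is also Postgres, is a frequently scheduled job (e.g. via pg_cron, which supports sub-minute schedules in recent versions) that copies rows past a watermark. A rough sketch, with schema and column names assumed:

```sql
-- Sketch: copy newly arrived hot rows into a cold table, then enforce a
-- retention window on the hot copy. Assumes recorded_at is indexed and
-- monotonically increasing per batch; schedule this e.g. every 15 seconds.
BEGIN;

INSERT INTO cold.tag_values
SELECT *
FROM hot.tag_values
WHERE recorded_at > (SELECT coalesce(max(recorded_at), '-infinity')
                     FROM cold.tag_values);

DELETE FROM hot.tag_values
WHERE recorded_at < now() - interval '7 days';  -- hot retention, adjust to taste

COMMIT;
```

If the destination is a Data Lake instead, the same watermark idea applies, with the SELECT feeding an export step (e.g. COPY to object storage) rather than an INSERT.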

If there are any ideas on possible ways to accomplish this, that would be great.

I would use PostgreSQL logical replication for a case like this. Make sure your primary keys are UUIDs, to avoid clashes when merging rows from multiple sources.
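As a rough sketch of that setup (hostnames, names, and credentials below are placeholders): each spoke publishes the two tables, and the hub creates one subscription per site. The spokes need `wal_level = logical` in `postgresql.conf`.

```sql
-- On each field (spoke) database: publish the two tables.
CREATE PUBLICATION spoke_pub FOR TABLE events, tag_values;

-- On the Azure hub: one subscription per site. The hub pulls from the
-- spoke, so the spoke must be reachable; a DNS name (or VPN address)
-- rather than a raw IP helps with non-static IPs, and sslmode=require
-- encrypts the replication stream in transit.
CREATE SUBSCRIPTION site_a_sub
    CONNECTION 'host=site-a.example.com dbname=scada user=repl password=... sslmode=require'
    PUBLICATION spoke_pub;
```

Note that for the per-site copies of Table 2, each hub subscription can map to its own target table, while the Table 1 subscriptions can all feed the single merged events table (which is where UUID keys matter).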

Thank you for the suggestion, it looks like it can achieve what we need. We still have to do some testing around encryption and compression, as well as handling non-static IPs, but we appreciate the help.