Synchronising databases between HMIs and a Central Ignition Server

I'm in a conundrum.
I need a central Ignition gateway that will serve clients on 5 machines 99% of the time. For the remaining 1%, the clients may need to run from their own local fallback gateways due to network failure or some other issue.

The Ignition project relies heavily on a database and its tables to run, so each machine and the central server will all need SQL installed along with copies of the required tables. The tables are populated from job files, which are essentially just CSVs with one part to produce per row. Importing a job file adds a record to the cfg_jobs table and a record for each part to the cfg_job_parts table (the parts table contains the machine instructions to produce each part); each part carries a job_id linking it back to its job.
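To make the import flow concrete, here's a minimal sketch of it using sqlite3 (the actual database is different, and column names like job_name and instructions are assumptions, not the real schema):

```python
import csv
import io
import sqlite3

# Illustrative schema only -- column names beyond cfg_jobs, cfg_job_parts
# and job_id are assumptions for the sketch.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE cfg_jobs (
    id       INTEGER PRIMARY KEY AUTOINCREMENT,
    job_name TEXT NOT NULL
);
CREATE TABLE cfg_job_parts (
    id           INTEGER PRIMARY KEY AUTOINCREMENT,
    job_id       INTEGER NOT NULL REFERENCES cfg_jobs(id),
    instructions TEXT NOT NULL
);
""")

def import_job_file(conn, job_name, csv_text):
    """One cfg_jobs record per file, one cfg_job_parts record per CSV row."""
    cur = conn.execute("INSERT INTO cfg_jobs (job_name) VALUES (?)", (job_name,))
    job_id = cur.lastrowid
    for row in csv.reader(io.StringIO(csv_text)):
        conn.execute(
            "INSERT INTO cfg_job_parts (job_id, instructions) VALUES (?, ?)",
            (job_id, row[0]),
        )
    return job_id

job_id = import_job_file(conn, "JOB-1001", "cut 100mm\nroll 3x\n")
```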

The project logs each part into another table (dat_part_roll_log) when a part is produced.

All tables have an id field which is an auto-incrementing identity. Some have foreign keys linking them to other tables. Schema below:

The issue is that the machines need to be able to import new jobs and log parts while running locally, i.e. while they have no access to the central database/Ignition gateway. I'm not sure of the best way to synchronise all of these independent databases.

My concerns are:

  • the id fields in the cfg_jobs and cfg_job_parts tables must stay in sync across all of the databases. If multiple local systems each import a new job, however, each will auto-increment its own ids, so the same id will end up pointing to different jobs/parts on different systems.
  • the same applies to the log tables
  • I don't know how to sync changes between the different systems - it sounds like a nightmare in both engineering effort and performance
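The id-collision concern from the first bullet is easy to reproduce. A sketch with two independent sqlite3 databases standing in for two offline machines (job names are made up):

```python
import sqlite3

# Two independent local databases, each with its own auto-incrementing id.
ddl = "CREATE TABLE cfg_jobs (id INTEGER PRIMARY KEY AUTOINCREMENT, job_name TEXT)"
machine_a = sqlite3.connect(":memory:")
machine_a.execute(ddl)
machine_b = sqlite3.connect(":memory:")
machine_b.execute(ddl)

# Each machine imports a *different* job while disconnected.
machine_a.execute("INSERT INTO cfg_jobs (job_name) VALUES ('JOB-1001')")
machine_b.execute("INSERT INTO cfg_jobs (job_name) VALUES ('JOB-2002')")

id_a = machine_a.execute("SELECT id FROM cfg_jobs").fetchone()[0]
id_b = machine_b.execute("SELECT id FROM cfg_jobs").fetchone()[0]
# Both rows get id 1, yet they describe different jobs -- merging the
# databases later would be a conflict.
print(id_a == id_b)  # True
```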

Any thoughts and ideas would be much appreciated!

Thinking aloud...
One idea is to instead use a text id field on cfg_jobs, set to the job's name. I can guarantee this will be identical wherever the job is imported. The log table itself doesn't need the same ids across all databases, and only the central database needs to store the logs from every machine - the machines don't need each other's logs.

What would be the best way to sync the changes to the other databases though? system.db.runSFUpdateQuery? Some complicated SQL clustering? (I'm very new to database redundancy and clustering of any kind.)
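Since the logs only ever flow one way (machine to central), one common pattern is a watermark push: each machine remembers the last local log id it sent, and the central table is keyed on (machine, local_id) so ids from different machines can never clash. A hypothetical sketch with sqlite3 standing in for both databases:

```python
import sqlite3

# Local machine DB: plain auto-increment log table.
local = sqlite3.connect(":memory:")
local.execute("""CREATE TABLE dat_part_roll_log (
    id INTEGER PRIMARY KEY AUTOINCREMENT, part TEXT)""")

# Central DB: keyed on (machine, local_id) so machines never collide.
central = sqlite3.connect(":memory:")
central.execute("""CREATE TABLE dat_part_roll_log (
    machine  TEXT,
    local_id INTEGER,
    part     TEXT,
    PRIMARY KEY (machine, local_id))""")

def push_logs(local, central, machine, watermark):
    """Send every local row newer than the watermark; return the new watermark."""
    rows = local.execute(
        "SELECT id, part FROM dat_part_roll_log WHERE id > ? ORDER BY id",
        (watermark,),
    ).fetchall()
    for local_id, part in rows:
        # OR IGNORE makes retries after a dropped connection harmless.
        central.execute(
            "INSERT OR IGNORE INTO dat_part_roll_log VALUES (?, ?, ?)",
            (machine, local_id, part),
        )
        watermark = local_id
    return watermark  # persist this for the next sync cycle

local.executemany("INSERT INTO dat_part_roll_log (part) VALUES (?)",
                  [("part-A",), ("part-B",)])
wm = push_logs(local, central, "machine-1", 0)
wm = push_logs(local, central, "machine-1", wm)  # re-run: no duplicates
```

For the logging path specifically, Ignition's store-and-forward system (which system.db.runSFUpdateQuery writes through) already does this kind of buffered, retried delivery to the central database when the connection comes back.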

Sounds to me like a design flaw. I would make the local gateways (with their databases) be in command of their production lines all the time: push all instructions down as soon as they're scheduled, and report production back up the chain whenever the connection is good. Point the local clients at their local gateways all the time; they can open a new session upstream to see global resources.

(This isn't really a DB replication/redundancy situation, IMNSHO.)