Persistent data storage in Python

I am working on a system that will need to manage several dozen different automation processes. Each process controls a part of the system – runs a motor, moves a pump, opens a valve until a tank fills to a certain level, etc. At any given time, each process can be active, inactive, or blocked (waiting to run while a higher-priority process is running).

I would like to create a script that runs periodically (say every 300ms) and checks on and modifies the state of each process: starts it if it needs to be started, changes the state of some of the components it controls if necessary, stops it if its work is finished, blocks it if another process of higher priority needs to be launched… in short, something like an RTOS, except not particularly real-time, since latency is not important in this application. I need a persistent way to store information about the state of each process: whether it is active/blocked/inactive, what step it is on, its priority, etc. I suppose I could store this information in tags, but that seems ugly and hard to manage. I would much rather have a Python class for each process.
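For example, I'm picturing something along these lines (just a sketch; all names are made up):

```python
class Process(object):
    """State record for one automation process."""
    ACTIVE, BLOCKED, INACTIVE = 'active', 'blocked', 'inactive'

    def __init__(self, name, priority, state=INACTIVE, step=0):
        self.name = name          # e.g. 'fillTank'
        self.priority = priority  # higher priority can block lower
        self.state = state        # active / blocked / inactive
        self.step = step          # current step in its sequence

    def poll(self):
        # Called every ~300ms: start, advance, stop, or block this process.
        pass
```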

I could have an Ignition startup script that sets up all the processes, then sleeps and wakes up every 300ms to check on them. Is there a disadvantage to doing that? It seems cleaner to use a timer script, but then I see no way to keep access to my data structures between executions. I am also open to suggestions for an entirely different way of achieving the process management I'm looking for.

Thank you!

Jack

Create a shared gateway script containing the function you want to run regularly. Outside of that function, at module level, assign an empty dictionary to a "global state" variable, and never assign to that name anywhere else; only read it and set key/value pairs in it, holding the process class objects you need. Call the function from a runScript() expression tag that has a 300ms scan class, and have the function return a dataset with enough information to reconstruct the class objects. That matters because editing the shared script reloads the module and resets the module-level dictionary; on the first call after a reload, the function can read the previous dataset back from the tag and rebuild the objects.
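A minimal sketch of that pattern, reusing the Process class from the question (the module name shared.procman and the tag path are my inventions, not anything standard; system.tag.read is the Ignition 7.x call, renamed system.tag.readBlocking in 8.x):

```python
# shared.procman  (shared gateway script; assumes the Process class above
# lives in this same module)

processes = {}  # the ONLY assignment to the global-state name; everywhere
                # else we just read it or mutate its contents

# Path of the expression tag that calls poll() (hypothetical).
STATE_TAG = '[default]ProcessManager/State'

def poll():
    """Entry point, called from the expression tag every 300ms."""
    if not processes:
        # The module was just (re)loaded, e.g. after a script edit, so
        # rebuild the objects from the dataset the tag held before the edit.
        prev = system.tag.read(STATE_TAG).value
        if prev is not None and prev.rowCount > 0:
            for r in range(prev.rowCount):
                p = Process(prev.getValueAt(r, 'name'),
                            prev.getValueAt(r, 'priority'),
                            prev.getValueAt(r, 'state'),
                            prev.getValueAt(r, 'step'))
                processes[p.name] = p
        else:
            # Very first run: build the processes from scratch.
            processes['fillTank'] = Process('fillTank', priority=1)
            processes['runMotor'] = Process('runMotor', priority=2)

    # Check on and advance every process.
    for p in processes.values():
        p.poll()

    # Return everything needed to rebuild the objects after the next reload.
    headers = ['name', 'priority', 'state', 'step']
    rows = [[p.name, p.priority, p.state, p.step] for p in processes.values()]
    return system.dataset.toDataSet(headers, rows)
```

The expression tag would be a Dataset-type tag on the 300ms scan class with an expression like runScript("shared.procman.poll()", 0); the 0 poll rate just means the script runs whenever the tag's scan class fires.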