Thanks for the reply, Zac!
The values are indeed from a live industrial system, but it's a remote system, users aren't controlling it, and snapshots update every ~5 minutes. I have a crontab job fetching and inserting this bulk data into a large, minimally structured SQL table, as part of a larger external data-collection effort.
I’m using the transaction group primarily to transform a subset of this bulk data from horizontal to vertical, triggered on a timestamp value change, and retained only for a limited window into the past (using “Delete records older than…”). So yes, the bulk table sees limited traversal: just enough to build this transposed subset into a smaller “helper” table of maybe the last week or month of data. The alternatives seemed worse: multiple Vision components per window (from which users could potentially ask for the moon), or query tags hitting the bulk table every 5, 10, or 15 seconds, filling up my historian with duplicate and repeating data (since I don’t know exactly when the new 5-minute data will arrive), all in a costly effort to keep things snappy on the client side.
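For what it’s worth, the horizontal-to-vertical transform amounts to unpivoting one wide snapshot row into (timestamp, name, value) rows for the narrow helper table. A minimal pure-Python sketch of the idea (the column names and row shapes here are hypothetical, not my actual schema):

```python
# Unpivot one horizontal snapshot row (a dict of columns) into
# vertical (t_stamp, tag_name, value) rows for a narrow helper table.
# All names here are hypothetical placeholders.

def unpivot_snapshot(row, value_columns, ts_column="t_stamp"):
    """Turn one wide row into a list of (timestamp, name, value) tuples."""
    ts = row[ts_column]
    return [(ts, col, row[col]) for col in value_columns if col in row]

snapshot = {"t_stamp": "2024-01-01 00:05:00", "flow_a": 12.5, "flow_b": 7.1}
rows = unpivot_snapshot(snapshot, ["flow_a", "flow_b"])
# rows -> [("2024-01-01 00:05:00", "flow_a", 12.5),
#          ("2024-01-01 00:05:00", "flow_b", 7.1)]
```

The transaction group does this declaratively, but the same loop works in a gateway script if the group ever becomes too rigid.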
The end goal is for each of many clients to be able to sort, reorder columns, and time-slice vertical data feeding a set of stacked bar charts. I’m trying to create the effect of a synchronized time-series Sankey diagram that each user can shuffle according to their own objectives, asynchronously, without flooding the database with crontab-like queries. Also, when a chart updates, it re-initializes the component, potentially interrupting the user’s analysis every 5, 10, or 15 seconds.
I actually started out with this approach (mostly because I couldn’t get a transaction group to work with only expression tags that existed solely in the group). Ultimately, I need to write down to components’ datasets in Vision, give the user some ability to manipulate that dataset individually, and control whether the data auto-updates. It goes back to my initial reply: building a dataset from the ground up with a transaction group/individual tags. I’m experimenting with the DatasetBuilder library that Phil has suggested in many of his posts; it seems extensible enough. I just need to sharpen my Python looping and data-structure skills (or go buy some), or study his example closely.
For the time being, I’m settling for a client memory dataset tag fed by a set of client tags, which take user inputs and parameterize a SQL query against the helper table, with its polling turned off. When a user changes their input values, it fires another query. I just need to figure out how to fire that same query when new data arrives in the helper table… probably a client tag “value change” event script, triggered from the same gateway tag that triggers the transaction group, that takes the last SELECT query and runs it again.
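That “run the last query again” piece can be handled by caching the last parameter set whenever the user fires a query, then replaying it from the trigger tag’s value-change script. A rough sketch of the caching logic in plain Python (all names hypothetical; in Ignition, `run_query` would wrap `system.db.runPrepQuery`, and `on_new_data` would live in the client tag’s value-change event):

```python
# Cache the last query parameters so a tag value-change event can
# replay the same query when fresh rows land in the helper table.
# Hypothetical names; in Ignition, run_query would call
# system.db.runPrepQuery and on_new_data would be the client tag's
# value-change script.

class QueryReplayer:
    def __init__(self, run_query):
        self._run_query = run_query   # callable taking a params dict
        self._last_params = None

    def user_query(self, params):
        """Fired when the user changes an input value."""
        self._last_params = dict(params)  # remember for later replay
        return self._run_query(self._last_params)

    def on_new_data(self):
        """Fired by the trigger tag's value-change event."""
        if self._last_params is not None:
            return self._run_query(self._last_params)
        return None  # user hasn't queried yet; nothing to replay

calls = []
replayer = QueryReplayer(lambda p: calls.append(p))
replayer.user_query({"start": "2024-01-01", "end": "2024-01-07"})
replayer.on_new_data()  # replays with the same parameters
```

In practice the cached parameters could just live in the client tags themselves, since they already hold the user’s inputs; the value-change script only needs to reread them and rerun the query.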
Does it ever end? Sure seems like there should be a shortcut.