I’m going to try to generalize my question so the answers will be useful for more than just me.
The situation I’m facing is this: I have a machine from which I can read the current production count. I’ve enabled the historian on this tag, but I’m struggling to figure out how to calculate the production rate over a given timeframe in a way that isn’t fragile, prone to poor performance, or hard for maintenance to comprehend.
The current version I’m looking at basically has a script that queries the tag historian for data in the timeframe and selects the largest value from the resulting dataset. Then it queries for the calculated minimum in the window and compares the two values to get the production over the timeframe. That value is then divided by the number of time units in the timeframe (e.g. minutes in the hour) to get production per unit for the timeframe.
This script runs on a timer so the value refreshes regularly. Given that the count updates in near real time, it feels fundamentally wrong to be running this chain of queries and slicing datasets just to get such a basic value.
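To make that concrete, here’s a rough sketch of what the timer script does. The tag path, the one-hour window, and the exact query parameters are placeholders for this post, not the real configuration:

```python
# Rough sketch of the current timer script (placeholder tag path and window).
end = system.date.now()
start = system.date.addHours(end, -1)  # one-hour window, as an example
path = "[default]Line1/ProductionCount"  # placeholder tag path

# Query 1: pull the raw history for the window and pick the largest value myself.
raw = system.tag.queryTagHistory(
    paths=[path],
    startDate=start,
    endDate=end,
    returnSize=-1,  # values as they changed
)
values = [raw.getValueAt(r, 1) for r in range(raw.getRowCount())
          if raw.getValueAt(r, 1) is not None]
windowMax = max(values) if values else 0

# Query 2: ask the historian for the calculated minimum over the same window.
calc = system.tag.queryTagCalculations(
    paths=[path],
    calculations=["Minimum"],
    startDate=start,
    endDate=end,
)
windowMin = calc.getValueAt(0, 1)  # row 0 = this path, column 1 = Minimum

# Production over the window, divided by the time units in it (minutes in the hour).
ratePerMinute = (windowMax - windowMin) / 60.0
```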
I’m given to think I might be missing something, but so far I can’t find a great alternative.
I even tried to condense the queries by asking for both the maximum and minimum counts in the timeframe in a single queryTagCalculations call, but the maximum it returns is significantly smaller than the highest value I get if I pull all the values in the timeframe and pick the largest myself.
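For reference, the condensed attempt looked roughly like this (same placeholder path and window as above; the column indexing reflects how I understand the dataset returned by queryTagCalculations):

```python
# Attempted one-call version: both aggregates from queryTagCalculations.
calc = system.tag.queryTagCalculations(
    paths=[path],
    calculations=["Maximum", "Minimum"],
    startDate=start,
    endDate=end,
)
# As I understand it, the result has one row per path: [path, Maximum, Minimum].
windowMax = calc.getValueAt(0, 1)
windowMin = calc.getValueAt(0, 2)
ratePerMinute = (windowMax - windowMin) / 60.0
# This windowMax comes back noticeably smaller than the max I get from the
# raw-history query in the earlier sketch.
```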
So the question is basically this: how do you do this without running a hefty script on every value change and/or timer tick? Can this value be calculated when the count updates, without being too heavy to run safely at that kind of frequency?