I have tags that calculate throughput for individual ports on network switches (via the Kymera SNMP driver). The calculation is: take the difference in ifInOctets and the difference in ifOutOctets over the same time period, multiply the sum by 800, and divide by the port speed and the time difference. Both ifInOctets and ifOutOctets are OPC tags.
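For reference, here is the formula above as a small plain-Python sketch (function and variable names are my own). One detail worth noting: ifInOctets/ifOutOctets are Counter32 values, so the delta should be taken modulo 2^32 to survive a counter wrap.

```python
COUNTER32_MAX = 2 ** 32

def counter_delta(prev, curr, modulus=COUNTER32_MAX):
    """Difference between two counter samples, tolerating one wraparound."""
    return (curr - prev) % modulus

def utilization_pct(in_prev, in_curr, out_prev, out_curr,
                    if_speed_bps, delta_t_s):
    """(delta_in + delta_out) octets * 8 bits * 100 % / (speed * time)
    -- i.e. the 'multiply by 800' formula from the post."""
    delta_octets = (counter_delta(in_prev, in_curr)
                    + counter_delta(out_prev, out_curr))
    return delta_octets * 800.0 / (if_speed_bps * delta_t_s)

# Example: 12,500,000 octets (100 Mbit) received in 10 s on a 100 Mb/s port
# -> 10% utilization.
print(utilization_pct(0, 12_500_000, 0, 0, 100_000_000, 10.0))
```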
I achieved this by making a throughput tag that runs a script from the Project Library. That script queries history for both ifInOctets and ifOutOctets, takes the difference in values, and returns the calculated throughput. However, I'm starting to doubt that this is best practice: with 20+ switches of 20+ ports each, the number of scripts running and queries executing against the historian every second is quite large.
The only alternative I can think of is to have a temporary tag that lags 1 s behind the OPC tag, and read that tag instead of querying the historian. I'm not sure how to do this properly or reliably, though.
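One way to sketch that "remember the previous sample" idea without a lagging tag or a historian query is an in-memory cache keyed by tag path. In Ignition this state could live in a Project Library module-level dict, updated from a tag valueChanged script or a single timer script; all names below are hypothetical, and this handles one counter per tag path (the in/out deltas would be combined by the caller):

```python
_last_samples = {}  # tag_path -> (timestamp_s, octet_count)

def throughput_on_new_sample(tag_path, timestamp_s, octets,
                             if_speed_bps, modulus=2 ** 32):
    """Return percent utilization since the previous sample of this tag,
    or None for the first sample (no delta available yet)."""
    prev = _last_samples.get(tag_path)
    _last_samples[tag_path] = (timestamp_s, octets)
    if prev is None:
        return None
    prev_t, prev_octets = prev
    delta_t = timestamp_s - prev_t
    if delta_t <= 0:
        return None  # out-of-order or duplicate sample; skip it
    delta_octets = (octets - prev_octets) % modulus  # tolerate counter wrap
    return delta_octets * 800.0 / (if_speed_bps * delta_t)
```

The trade-off versus the historian approach: each new OPC sample costs one dict lookup instead of a history query, but the cache is lost on a gateway restart, so the first sample afterward yields no value.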
Has anyone dealt with tags that calculate the derivative of other tags?
What would be the best practice for doing this? I would appreciate advice.