I’m working with a system where tag history is configured consistently across tags: the same sample mode and the same min and max time between samples. Using the system.tag.queryTagHistory() function, I’m trying to extract the maximum value of a few of these tags over the course of a year. Some tags return this value in roughly 2 seconds; others time out even with the function’s timeout parameter set to 10 minutes. All of the tags are in the same history provider and have a similar number of raw data points over the time range. Is there a good explanation for why we might see drastically different performance when everything else appears equal?
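For context, a minimal sketch of the kind of query involved. The tag path is hypothetical, and I'm assuming here that returnSize=1 with aggregationMode="Maximum" asks the historian for a single aggregated row per tag, and that timeout is in milliseconds:

```python
from java.util import Calendar

# Build a one-year window ending now.
cal = Calendar.getInstance()
end = cal.getTime()
cal.add(Calendar.YEAR, -1)
start = cal.getTime()

results = system.tag.queryTagHistory(
    paths=["[default]Plant/Line1/Temperature"],  # hypothetical path
    startDate=start,
    endDate=end,
    returnSize=1,                  # one row: the aggregate over the whole range
    aggregationMode="Maximum",
    returnFormat="Wide",
    timeout=600000,                # 10 minutes
)
```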
How are you determining that the tags have a similar number of raw data points?
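If you haven't measured it directly, one way to check (sketch, hypothetical path; this assumes returnSize=-1 returns values as they changed, so the row count approximates the raw sample count):

```python
from java.util import Calendar

cal = Calendar.getInstance()
end = cal.getTime()
cal.add(Calendar.YEAR, -1)
start = cal.getTime()

ds = system.tag.queryTagHistory(
    paths=["[default]Plant/Line1/Temperature"],  # hypothetical path
    startDate=start,
    endDate=end,
    returnSize=-1,                 # values as they changed, no aggregation
)
print ds.getRowCount()
```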