I've searched and found several references to issues with minimum and maximum aggregation not working, in both TagCalculations and TagHistory. I've created a simple slider that adjusts a date/time inspection range, which changes the query, along with an accompanying Easy Chart to show the data. Something is not making sense. The associated code:
rootpath = event.source.parent.Analog1_UDT.Meta.TagPath
paths = [rootpath + "/ProcessValue"]

# Query window: from (now - slider minutes) to now
endTime = system.date.now()
minutes = -1 * event.source.parent.getComponent('Range Slider').value
startTime = system.date.addMinutes(endTime, minutes)

# One aggregated query per statistic, each asked to return a single row
data1 = system.tag.queryTagHistory(paths, startTime, endTime, aggregationMode="Minimum", returnSize=1, ignoreBadQuality=True)
event.source.parent.getComponent('Analog 2 Min Ind').Value = data1.getValueAt(0, 1)

data2 = system.tag.queryTagHistory(paths, startTime, endTime, aggregationMode="Average", returnSize=1, ignoreBadQuality=True)
event.source.parent.getComponent('Analog 2 Avg Ind').Value = data2.getValueAt(0, 1)

data3 = system.tag.queryTagHistory(paths, startTime, endTime, aggregationMode="Maximum", returnSize=1, ignoreBadQuality=True)
event.source.parent.getComponent('Analog 2 Max Ind').Value = data3.getValueAt(0, 1)

# Keep the Easy Chart in sync with the queried range
event.source.parent.getComponent('Easy Chart').startDate = startTime
event.source.parent.getComponent('Easy Chart').endDate = endTime
If I adjust the chart to show a min/max/min, then the max works.
If I adjust the chart to show a min/max, then the max does not work.
Max is the bottom indicator.
Any thoughts?
As a note, you can see that Average changes accordingly and appears to be correct, even though Min and Max both read 200.
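(As an aside: the three aggregates can also be fetched in a single call with system.tag.queryTagCalculations. A sketch is below, but per the threads mentioned above, TagCalculations reportedly shows the same min/max problem, so this is an equivalent formulation rather than a fix.)

# Single call: returns one row per path, with one column per calculation
calcs = system.tag.queryTagCalculations(paths=paths,
                                        calculations=["Minimum", "Average", "Maximum"],
                                        startDate=startTime,
                                        endDate=endTime,
                                        ignoreBadQuality=True)
# Column 0 is the tag path; columns 1-3 follow the order of `calculations`
minVal = calcs.getValueAt(0, 1)
avgVal = calcs.getValueAt(0, 2)
maxVal = calcs.getValueAt(0, 3)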
Revisiting this: in Ignition 8.1.38, I'm still having this problem. Is there a better way to calculate the Min and Max over a range? The Average value is correct, but Min and Max seem to be affected by where the data point was recorded by Ignition.
All tags are set up as On Change, with an absolute deadband of 0.5 and a 10-second minimum between samples.
It appears that if the value was static at the start of the interrogated range, the data point recorded before the range begins is ignored by the minimum calculation but included in the average calculation. For example:
data recorded at 8:50 AM: value 0
data recorded at 9:36 AM: value 0
data recorded at 9:41 AM: value 100
A Minimum aggregate from 9:38 to 10:08 shows a minimum value of 100, not 0. The aggregate seems to move forward in time to find the first value entry, as if it were using Natural interrogation.
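One thing worth testing against this behavior is includeBoundingValues, which asks the query to also return the value just outside each end of the range. I haven't confirmed whether the Minimum aggregate honors it, so treat this as an experiment rather than a fix:

# Experiment: ask for the bounding value before startTime so the static
# 9:36 AM point (value 0) is in scope for the Minimum aggregate
dataMin = system.tag.queryTagHistory(paths, startTime, endTime,
                                     aggregationMode="Minimum",
                                     returnSize=1,
                                     includeBoundingValues=True,
                                     ignoreBadQuality=True)
print dataMin.getValueAt(0, 1)   # hoping for 0 here, not 100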
Update: Solved.
Setting ReturnSize to 0 in the query seemed to solve it, but for the Minimum value only; keeping ReturnSize at 1 was still necessary for Max.
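For anyone landing here later, a sketch of the working combination (same components as in my script above; note that returnSize=0 can return more than one row depending on the logging rates, so I reduce across rows to be safe):

# Minimum: returnSize=0 (the "natural" return size) picks up the static value
dataMin = system.tag.queryTagHistory(paths, startTime, endTime, aggregationMode="Minimum", returnSize=0, ignoreBadQuality=True)
# returnSize=0 may yield several rows, so take the smallest across all of them
minVals = [dataMin.getValueAt(r, 1) for r in range(dataMin.getRowCount()) if dataMin.getValueAt(r, 1) is not None]
if minVals:
    event.source.parent.getComponent('Analog 2 Min Ind').Value = min(minVals)

# Maximum: keeping returnSize=1 was still necessary here
dataMax = system.tag.queryTagHistory(paths, startTime, endTime, aggregationMode="Maximum", returnSize=1, ignoreBadQuality=True)
event.source.parent.getComponent('Analog 2 Max Ind').Value = dataMax.getValueAt(0, 1)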
The 8.1 documentation seems to be in error:
"The number of samples to return. -1 will return values as they changed, and 0 will return the "natural" number of values based on the logging rates of the scan class(es) involved. -1 is the default. [optional]"
Setting Return Size to -1 creates a script execution error; a value of 1 does not.
Probably because -1 expects no aggregation. And if it is used with the Wide return format, it produces many duplicate values, since every unique timestamp gets its own row.
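Given that, the raw return mode suggests an aggregation-free alternative for the original question: pull the rows as they changed and compute Min/Max in the script. A minimal sketch, assuming returnSize=-1 behaves as documented when no aggregationMode is passed (component names are the ones from my script above):

# Raw rows for a single path: column 0 is the timestamp, column 1 the value
raw = system.tag.queryTagHistory(paths, startTime, endTime,
                                 returnSize=-1,
                                 includeBoundingValues=True,
                                 ignoreBadQuality=True)

values = [raw.getValueAt(r, 1) for r in range(raw.getRowCount())
          if raw.getValueAt(r, 1) is not None]
if values:
    event.source.parent.getComponent('Analog 2 Min Ind').Value = min(values)
    event.source.parent.getComponent('Analog 2 Max Ind').Value = max(values)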