Tag history query

We have 8 fish gutting machines, and we need to create a status page for them.
Each of these machines has an internal counter counting the number of fish passed through. I have set up this counter value as an SQL tag and enabled history for it.

Now I want to display the number of gutted fish for a given period of time (set by a date slider) in a bar chart.
For this I would need to return the difference between the highest and lowest value in the selected range for each machine.

I have tried creating a script module using system.tag.queryTagHistory, but I am having two problems:

  • When I set the return size of system.tag.queryTagHistory to 2 and the aggregation mode to “Min/Max”, it returns a seemingly random number of rows (and always more than two). Why?
  • How do I convert the returned DateTime values to Python's date format?

Does anyone have example code for a similar application? Are there other/simpler ways of doing this?

A return size of 2 means you want 2 timestamps/datapoints regardless of how long your time range is. The Min/Max aggregation mode means that for each timestamp in the result set, if there is more than one piece of data in that time range, it will return both the min and max values (data values, not timestamps).
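On the second question: in Ignition's Jython scripting environment the timestamps come back as java.util.Date objects, and a common way to convert them is via the epoch value. A minimal sketch (assuming getTime() returns milliseconds since the epoch, and using UTC to sidestep time-zone questions):

```python
from datetime import datetime

def to_py_datetime(epoch_millis):
    # java.util.Date.getTime() returns milliseconds since the epoch;
    # Python's timestamp functions expect seconds, so divide by 1000.0.
    return datetime.utcfromtimestamp(epoch_millis / 1000.0)

# In an Ignition script you would pass the Java date's epoch value,
# e.g. to_py_datetime(java_date.getTime()).
print(to_py_datetime(86400000))  # 1970-01-02 00:00:00
```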

You are better off using the On Change aggregation mode and writing a propertyChange script to compare the values of the first and last records manually.
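The first/last comparison itself is a one-liner. A sketch of the per-machine calculation (the tag path, variable names, and query call in the comment are illustrative, not a definitive implementation):

```python
def fish_count(values):
    """Gutted fish in the period: last counter reading minus the first.
    values: chronologically ordered counter readings for one machine."""
    if len(values) < 2:
        return 0  # not enough history in the selected range
    return values[-1] - values[0]

# In Ignition (sketch, path is a placeholder) the readings would come from:
#   ds = system.tag.queryTagHistory(paths=["[default]Machine1/FishCount"],
#                                   startDate=start, endDate=end,
#                                   aggregationMode="OnChange")
#   values = [ds.getValueAt(row, 1) for row in range(ds.getRowCount())]
print(fish_count([100, 150, 230]))  # -> 130
```

Note that this assumes the counter only ever increases within the selected range; if a machine's counter can reset mid-period, you would need to detect the rollover and sum the segments instead.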