queryTagHistory returning 0 for timestamps with "On Change" setting

I have some tags that are being logged with the “On Change” sampling mode and a minimum sample time of 1s. Since we need the data for each tag at every second, the tag history was pulled into a CSV file using the queryTagHistory() function with returnSize=86400 and noInterpolation=True.
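
For reference, the call described above looks roughly like this (a sketch only; the tag path and date range are placeholders, and it assumes Ignition's built-in `system` scripting API):

```python
# Hypothetical example of the call described above.
end = system.date.now()
start = system.date.addDays(end, -1)

data = system.tag.queryTagHistory(
    paths=["[default]Line1/Temperature"],  # placeholder tag path
    startDate=start,
    endDate=end,
    returnSize=86400,        # one sample per second over 24 hours
    noInterpolation=True,
)
```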

However, we found that for some timestamps the tags are recorded with a value of 0. We believe that at the timestamps where no change occurred, the function returns the tag value as 0 rather than the last recorded value. For example, if the tag has a value of 56 at 10:20am up until a change to 58 at 10:25am, all data extracted between these two timestamps comes back as 0 (e.g. 10:20am: 56, 10:21am: 0, 10:22am: 0, ... 10:25am: 58).

How do I get the last value (i.e. 56) for the timestamps in between the timestamps where the change occurred? Thanks!

It hasn’t stored zeros. It is supplying zeros because you’ve asked for something it can’t do: provide a fixed number of samples without interpolation. Use sample size = 0 and format = raw with your desired time frame to get what was actually recorded. Then you will have to post-process to get a dataset with one row per second.
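
As an illustration of that post-processing step (not from the original thread), here is a minimal Jython sketch assuming Ignition's scripting environment and the default "Wide" result layout (timestamp in column 0, tag value in column 1). The tag path and date range are placeholders. It queries the recorded rows with returnSize=0 and then forward-fills one row per second with the last recorded value:

```python
# Sketch: query raw history, then carry the last value forward one row per second.
path = "[default]Line1/Temperature"          # placeholder tag path
end = system.date.now()
start = system.date.addDays(end, -1)

# returnSize=0 asks the historian for the recorded rows rather than a fixed count.
raw = system.tag.queryTagHistory(
    paths=[path],
    startDate=start,
    endDate=end,
    returnSize=0,
)

rows = []
lastValue = None   # stays None until the first recorded sample is reached
idx = 0
t = system.date.toMillis(start)
endMillis = system.date.toMillis(end)
while t <= endMillis:
    # Consume any raw samples at or before this second and remember the latest value.
    while idx < raw.getRowCount() and system.date.toMillis(raw.getValueAt(idx, 0)) <= t:
        lastValue = raw.getValueAt(idx, 1)
        idx += 1
    rows.append([system.date.fromMillis(t), lastValue])
    t += 1000

perSecond = system.dataset.toDataSet(["t_stamp", "value"], rows)
```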

Or use the correct tool for this type of recording: a transaction group set to one second.


Phil,

Is this still valid? I don't see anything in the docs for setting format=raw, and for sample size I'm assuming you mean returnSize? I've been using returnSize=0 and aggregationMode=LastValue - is this the best way to retrieve the pure data as it is in the database?

I'm not precisely sure, because I avoid these features of the tag historian. Please report what you find.

{ For anything that isn't purely an analog value independent of any other, I will continue using transaction groups or scripted equivalents. }