Valid data point calculation from SQL

Hi,

Assuming you're using 7.8, this sounds like a perfect use case for a feature that hasn't really been publicized yet (we were waiting for 7.8.1): custom Python aggregate functions.

You wouldn't have known about this, since we haven't rolled the documentation into the user manual yet, but we will soon. In the meantime, here's a link to the preliminary documentation:
Custom Python Aggregates

Basically, you can write calculation functions in Python and use them from system.tag.queryTagHistory and system.tag.queryTagCalculations. Unfortunately there isn't UI support for these functions at the moment; we'll have to add that soon. However, one way or another it's usually possible to work around that.

The functions can be defined as shared scripts, or the text of the function can be passed in. I’ll show some version of both in the following examples.

First, I created a script in the shared script library called ‘aggregates’. I defined it like this:

def isValid(qval, lowrange, highrange):
	# A qualified value counts as "valid" if it has good quality and falls inside the range.
	if qval is not None and qval.quality.isGood():
		val = float(qval.value)
		return val >= lowrange and val <= highrange
	return False

def percentValid(qval, interpolated, finished, blockContext, queryContext, lowrange=0.0, highrange=100.0):
	# Running state is kept on the block context between calls.
	validTimeCount = long(blockContext.getOrDefault('validTimeCount', 0))
	lastValidTime = long(blockContext.getOrDefault('lastValidTime', 0))
	wasValid = blockContext.getOrDefault('wasValid', False)
	if isValid(qval, lowrange, highrange):
		# Only accumulate time if the previous value was also valid.
		if wasValid:
			validTimeCount = validTimeCount + (qval.timestamp.getTime() - lastValidTime)
			blockContext['validTimeCount'] = validTimeCount
		blockContext['wasValid'] = True
		blockContext['lastValidTime'] = qval.timestamp.getTime()
	else:
		blockContext['wasValid'] = False

	if finished:
		# Return the fraction of the block's time span that was spent "valid".
		return float(validTimeCount) / float(blockContext.blockEnd - blockContext.blockStart)

Quick note: notice that I have added two extra parameters to the defined function. I figured you wouldn't want to hard-code the range… if you do, great. But in the second part of the example I'll show you how to pass those in. It's OK to add them to the function defined by the spec because I've given them default values, so Python won't care if they don't get passed in.
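In case the default-value behavior isn't obvious, here's a stripped-down illustration (plain Python, nothing Ignition-specific, and not the real function) of what happens when the extra parameters aren't supplied:

def demo(qval, interpolated, finished, blockContext, queryContext, lowrange=0.0, highrange=100.0):
	# Just echoes the bounds so you can see which ones were used.
	return (lowrange, highrange)

# Called the way the history system calls it, with only the five spec parameters:
print demo(None, False, True, {}, None)                 # (0.0, 100.0)
# Called with explicit bounds, like the wrapper example further down:
print demo(None, False, True, {}, None, -100.0, 100.0)  # (-100.0, 100.0)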

To use/test that function, I have a sine wave tag logging history, and I put a table and a button on a screen. I'll execute the calculations and put the data in the table. My Sine tag goes from -100 to 100, so you can see that, with my default allowed range of 0 to 100, I should get "0.5" over the last hour:

# Run the built-in average plus the custom aggregate over the last hour.
table = event.source.parent.getComponent('Table')
func = 'shared.aggregates.percentValid'
table.data = system.tag.queryTagCalculations(paths=['[History]Sine'], calculations=['Average', func], rangeHours=-1)

This gives me the time-weighted average and my custom calculation.
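As mentioned above, the same function reference can also be used with system.tag.queryTagHistory if you'd rather have the calculation broken into intervals instead of a single value. I haven't shown that on the screen, but it would look something like this (I'm assuming the function path goes into the aggregationMode parameter):

# Four rows, each holding the percent-valid for a quarter of the last hour.
table.data = system.tag.queryTagHistory(paths=['[History]Sine'], rangeHours=-1, returnSize=4, aggregationMode='shared.aggregates.percentValid')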

Now, if you wanted to pass a custom validity range, you could wrap the shared script in a locally defined function. In this example I've added two Numeric Text Fields to the window and work their values into the wrapper.

table = event.source.parent.getComponent('AggTable')
lowBound = event.source.parent.getComponent('LowBound').doubleValue
highBound = event.source.parent.getComponent('HighBound').doubleValue

# Note the 'python:' prefix when passing the function text in directly.
wrapper = """python:def wrapper(qval, interpolated, finished, blockContext, queryContext):
	return shared.aggregates.percentValid(qval, interpolated, finished, blockContext, queryContext, %s, %s)""" % (lowBound, highBound)
table.data = system.tag.queryTagCalculations(paths=['[History]Sine'], calculations=['Average', wrapper], rangeHours=-1)

I know this is quite a bit to process, but I hope it's fairly clear; feel free to let me know if anything isn't. There's unfortunately a bit of trial and error involved in writing these functions, and the logging functions on queryContext are useful for that. While working up this example, I personally ran into various problems with types and type conversions, which is why you see the liberal use of long() in the function I defined.
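On the debugging note, if you don't want to dig into the queryContext logging functions right away, one quick-and-dirty alternative is to wrap the shared function and log through system.util.getLogger. This is just a sketch (the wrapper and logger names are made up), dropped into the same 'aggregates' shared script:

def percentValidDebug(qval, interpolated, finished, blockContext, queryContext, lowrange=0.0, highrange=100.0):
	logger = system.util.getLogger('aggregates.percentValid')
	if qval is not None:
		# Log each value the engine hands us, plus whether this is the final call for the block.
		logger.info('value=%s, quality=%s, finished=%s' % (qval.value, qval.quality, finished))
	return percentValid(qval, interpolated, finished, blockContext, queryContext, lowrange, highrange)

Since the calculation runs on the gateway, the messages should show up in the gateway console.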


If you're not using 7.8, or don't want to go this route, the simpler suggestion is to log your data with the SQL Bridge in a format that lets you get the calculations you want directly with SQL.
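For reference, that route usually boils down to a transaction group logging the value to its own table, and then a simple query over that table. A rough sketch (the sine_log table and sine_value column are made up, and a plain row count only matches the time-weighted number if the group logs at a fixed rate):

from java.util import Calendar

# Start of the window: one hour ago.
cal = Calendar.getInstance()
cal.add(Calendar.HOUR, -1)
startTime = cal.getTime()

# Fraction of logged rows whose value fell inside the allowed range.
query = """SELECT SUM(CASE WHEN sine_value BETWEEN ? AND ? THEN 1 ELSE 0 END) / CAST(COUNT(*) AS FLOAT)
FROM sine_log WHERE t_stamp >= ?"""
pctValid = system.db.runScalarPrepQuery(query, [0.0, 100.0, startTime])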

Regards,