Valid data point calculation from SQL

I have a customer that currently uses an Excel spreadsheet to log data through another SCADA package that I am replacing with IA.
The data they log is just simple temperatures, which they average at the end of each 24-hour day. Along with that average, they check each collected sample against a range; if the value is within the range, it is considered “Valid Data,” and from that they calculate a percentage of valid data.
The tag is logged every 15 seconds.
In my report, I have the average working but I need some assistance on how to calculate the “Valid Data” points. The issue I am running into is how to validate that data since I am no longer logging to an Excel spreadsheet. It’s all being logged through the Ignition Tag Historian.


Assuming you’re using 7.8, this sounds like a perfect use case for a feature that hasn’t really been publicized yet (we were waiting for 7.8.1): custom Python aggregate functions.

You wouldn’t have known about this, since we haven’t rolled the documentation into the user manual yet, but we will soon. In the meantime, I’ll post a link to the preliminary documentation:
Custom Python Aggregates

Basically, you can write calculation functions in Python and use them from system.tag.queryTagHistory and system.tag.queryTagCalculations. Unfortunately there isn’t UI support for these functions at the current time; we’ll have to add that soon. However, one way or another it’s usually possible to work around that.

The functions can be defined as shared scripts, or the text of the function can be passed in directly. I’ll show a version of both in the following examples.

First, I created a script in the shared script library called ‘aggregates’. I defined it like this:

def isValid(qval, lowrange, highrange):
	if qval is not None and qval.quality.isGood():
		val = float(qval.value)
		return val >= lowrange and val <= highrange
	return False

def percentValid(qval, interpolated, finished, blockContext, queryContext, lowrange=0.0, highrange=100.0):
	validTimeCount = long(blockContext.getOrDefault('validTimeCount', 0))
	lastValidTime = long(blockContext.getOrDefault('lastValidTime', 0))
	wasValid = blockContext.getOrDefault('wasValid', False)
	if isValid(qval, lowrange, highrange):
		# Accumulate the time elapsed since the previous valid sample.
		if wasValid:
			validTimeCount = validTimeCount + (qval.timestamp.getTime() - lastValidTime)
			blockContext['validTimeCount'] = validTimeCount
		blockContext['wasValid'] = True
		blockContext['lastValidTime'] = qval.timestamp.getTime()
	else:
		blockContext['wasValid'] = False
	if finished:
		# Fraction of the block's time span covered by valid data.
		return float(validTimeCount) / float(blockContext.blockEnd - blockContext.blockStart)

Quick Note: notice that I have added two extra parameters to the defined function. I figured you wouldn’t want to hard code the range… if you do, great. But in the second part of the example I’ll show you how to pass those in. It’s ok to add them to the function defined by the spec because I’ve given them default values, so python won’t care if they don’t get passed in.
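If you want to sanity-check the time-weighting logic before wiring it into the gateway, here’s a minimal stand-alone harness. To be clear, QVal and BlockContext below are hypothetical mock stand-ins for the objects the history engine actually passes in (they are not Ignition API), and Jython’s long() is written as int() so this runs under plain CPython:

```python
class QVal(object):
    """Mock qualified value: value, quality, epoch-millis timestamp."""
    class _Quality(object):
        def __init__(self, good):
            self._good = good
        def isGood(self):
            return self._good
    class _Timestamp(object):
        def __init__(self, millis):
            self._millis = millis
        def getTime(self):
            return self._millis
    def __init__(self, value, millis, good=True):
        self.value = value
        self.quality = QVal._Quality(good)
        self.timestamp = QVal._Timestamp(millis)

class BlockContext(dict):
    """Mock block context: a dict with getOrDefault plus block bounds."""
    def __init__(self, blockStart, blockEnd):
        super(BlockContext, self).__init__()
        self.blockStart = blockStart
        self.blockEnd = blockEnd
    def getOrDefault(self, key, default):
        return self.get(key, default)

def isValid(qval, lowrange, highrange):
    if qval is not None and qval.quality.isGood():
        val = float(qval.value)
        return lowrange <= val <= highrange
    return False

def percentValid(qval, interpolated, finished, blockContext, queryContext,
                 lowrange=0.0, highrange=100.0):
    validTimeCount = int(blockContext.getOrDefault('validTimeCount', 0))
    lastValidTime = int(blockContext.getOrDefault('lastValidTime', 0))
    wasValid = blockContext.getOrDefault('wasValid', False)
    if isValid(qval, lowrange, highrange):
        # Accumulate time elapsed since the previous valid sample.
        if wasValid:
            validTimeCount += qval.timestamp.getTime() - lastValidTime
            blockContext['validTimeCount'] = validTimeCount
        blockContext['wasValid'] = True
        blockContext['lastValidTime'] = qval.timestamp.getTime()
    else:
        blockContext['wasValid'] = False
    if finished:
        return float(validTimeCount) / float(blockContext.blockEnd - blockContext.blockStart)

# Simulate a 1000 ms block: in range (0..100) for the first half,
# out of range for the second half -> half the block is valid.
ctx = BlockContext(0, 1000)
samples = [QVal(50.0, t) for t in range(0, 600, 100)] + \
          [QVal(150.0, t) for t in range(600, 1100, 100)]
for qv in samples:
    percentValid(qv, False, False, ctx, None)
result = percentValid(None, False, True, ctx, None)  # final "finished" call
print(result)  # 0.5
```

Running it with values that are valid for the first half of the block and invalid for the second half should print 0.5, matching the sine-wave test further down.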

To use/test that function, I have a sine wave tag logging history, and I put a table and a button on a screen. The button script executes the calculations and puts the data in the table. My Sine tag swings from -100 to 100, so with my default allowed range (0 to 100), I should get “0.5” over the last hour:

table = event.source.parent.getComponent('Table')
func = 'shared.aggregates.percentValid'
table.data = system.tag.queryTagCalculations(paths=['[History]Sine'], calculations=['Average', func], rangeHours=-1)

This gives me the time-weighted average and my custom calculation.

Now, if you wanted to pass a custom validity range, you could wrap up the shared script, with a locally defined script. In this example I’ve added two Numeric Text Fields to the window, and I work their values into the wrapper.

table = event.source.parent.getComponent('AggTable')
lowBound = event.source.parent.getComponent('LowBound').doubleValue
highBound = event.source.parent.getComponent('HighBound').doubleValue

wrapper = """python:def wrapper(qval, interpolated, finished, blockContext, queryContext):
	return shared.aggregates.percentValid(qval, interpolated, finished, blockContext, queryContext, %s, %s)""" % (lowBound, highBound)
table.data = system.tag.queryTagCalculations(paths=['[History]Sine'], calculations=['Average', wrapper], rangeHours=-1)

I know this is quite a bit to process, but I hope it’s fairly clear. Please feel free to let me know if anything isn’t clear. There’s unfortunately a bit of trial and error involved with writing these functions, and the logging functions on queryContext are useful for that. In working up this example, I personally ran into various problems with types and type conversions, which is why you see my liberal use of “long()” in the function I defined.

If you’re not using 7.8, or don’t want to go this route, the simpler suggestion is to log your data with the SQL Bridge into a format that more easily allows you to use SQL to get the calculations you want.


Thanks for the sample code! I’ll try it as soon as I can get back to that project. I’ll update you on how it works for me.


New challenge, same issue.
I have a transaction group logging data every 15 seconds to a MySQL DB.
At midnight, I have a report scheduled to average out the data logged from that day, but the customer still wants to see the “percent valid data points” that were logged — basically any value that is not 0 or ‘null’.
Is it possible to query the DB to count the number of data points logged for each tag, check each data point for being greater than 0, and get a “percent valid data” calculation, all within the same query in the report module?
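Yes — conditional aggregation can do all of that in a single SELECT. Here’s a hedged sketch using Python’s built-in sqlite3 so it’s runnable as-is; the table and column names (temps, t_stamp, value) are assumptions standing in for your transaction-group table, and the same SELECT syntax works in MySQL:

```python
import sqlite3

conn = sqlite3.connect(':memory:')
cur = conn.cursor()

# Hypothetical log table: one row per 15-second sample.
cur.execute("CREATE TABLE temps (t_stamp TEXT, value REAL)")
cur.executemany("INSERT INTO temps VALUES (?, ?)", [
    ('2016-01-01 00:00:00', 72.4),
    ('2016-01-01 00:00:15', 0.0),    # invalid: zero
    ('2016-01-01 00:00:30', None),   # invalid: null
    ('2016-01-01 00:00:45', 68.1),
])

# One query: total samples, valid samples, and percent valid.
# NULL > 0 evaluates to NULL, which falls through to ELSE 0,
# so null rows are counted as invalid, not skipped.
cur.execute("""
    SELECT COUNT(*) AS total,
           SUM(CASE WHEN value > 0 THEN 1 ELSE 0 END) AS valid,
           100.0 * SUM(CASE WHEN value > 0 THEN 1 ELSE 0 END) / COUNT(*) AS pct_valid
    FROM temps
""")
total, valid, pct = cur.fetchone()
print(total, valid, pct)  # 4 2 50.0
```

In the report module you’d point a data source at a query like that (adding a WHERE clause on t_stamp for the day in question); the CASE form is safer than averaging a boolean expression because it handles NULLs explicitly.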
