Count number of tags with value x and add to a dataset

I'd put everything in one dataset, then filter it depending on user interaction. You can even simulate tabs if that's the look you're after.

I'll suggest one way of doing it, which may not be the best way, but that's a start at least.
I expect there's a more performant way of doing this with only expressions if you're willing to install @pturmel's Integration Toolkit module.

step 1: read everything and make it a dataset
In your script library, add a python function like this (more about this later):

from itertools import izip

def build_dataset():
	# one path per machine, then a single blocking read for all of them
	paths = ["machine {}".format(n) for n in xrange(100, 201)]
	values = [qval.value for qval in system.tag.readBlocking(paths)]
	return system.dataset.toDataSet(
		['tag_path', 'value'],
		[[path, value] for path, value in izip(paths, values)]
	)

How exactly to build the paths will depend on... the paths. The snippet above builds them in the format "machine n" for every n between 100 and 200 inclusive (that's 101 paths).
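If your tags actually live in a folder, the paths just need to be fully qualified. Purely as an illustration (the provider, folder and tag names here are made up, adapt them to your own structure):

# hypothetical layout: [provider]folder/tag
paths = ["[default]Machines/machine {}/status".format(n) for n in xrange(100, 201)]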

step 2: pull the dataset to your window/component/whatever:
Add a custom property to your window, let's call it "full_data", of type dataset.
Add an expression binding to it, and use this expression: runScript('path/to/build_dataset', 5000)
You'll need to adapt the path of the function (you can right click a function name in the right part of the script library editor to copy its path).
The "5000" here means the script will be executed every 5 seconds. Change this to match the refresh rate you want. 0 will make it run just once.

step 3: make another function to filter your dataset
Something like this:

def filter_dataset(ds, values):
	data = system.dataset.toPyDataSet(ds)
	return system.dataset.toDataSet(
		list(ds.columnNames),
		[row for row in data if row['value'] in values]
	)

This will return a new dataset with only the rows where the 'value' column matches an element of the values list passed to the function. This means that calling filter_dataset(ds, [0, 1, 2]) will return the full dataset in your case.
More about this later as well.
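To get a feel for it, here's what calling these from the script console could look like (assuming, hypothetically, that both functions live in a project library script called machines):

full = machines.build_dataset()
running = machines.filter_dataset(full, [1])           # only the rows whose 'value' column is 1
everything = machines.filter_dataset(full, [0, 1, 2])  # same rows as full, in your case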

step 4: use this function to populate your template repeater
Now all that's left is to add a repeater to your window, some way to let the user pick a status (dropdown, buttons... whatever), and bind the repeater's data to both the full_data custom property we made earlier and the user-selected status.
Use an expression binding again, with runScript('path/to/filter_dataset', 0, {path/to/full_data}, {path/to/selected_status})
I don't know whether runScript expressions reevaluate when one of the parameters changes. If they don't, you'll need to change the poll rate to something other than 0.
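For the "pick a status" part, a dropdown is probably the simplest: give its data property a two-column dataset of status values and labels, and bind your selected_status custom property to its selectedValue. The labels below are just made-up examples:

status_options = system.dataset.toDataSet(
	['Value', 'Label'],
	[[0, 'Stopped'], [1, 'Running'], [2, 'Faulted']]
)

One thing to keep in mind: filter_dataset expects a list of values while selectedValue is a single number, so you'll want to wrap it in a list somewhere (inside the function, for instance).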

That's it.

Now, about the functions. Those will only be usable in this particular use case... which is a bit of a shame. If you ever find yourself in a similar situation, you'll need to write new, very similar functions.
So, I'd suggest putting them in a custom library that contains reusable, generic functions.
The first one could be something like:

from itertools import izip

def tags_to_dataset(paths, columns_names):
	# read all the tags in one go, then pair each path with its value
	values = [qval.value for qval in system.tag.readBlocking(paths)]
	return system.dataset.toDataSet(
		columns_names,
		[[path, value] for path, value in izip(paths, values)]
	)

Which means you'll need to build your paths list separately. Which, at first sight, adds a layer of complexity, but is actually simpler in the long run.
Just make a new function specifically for that use case, which will build the paths list then call tags_to_dataset:

def build_dataset():
	paths = ["machine {}".format(n) for n in xrange(100, 201)]
	return tags_to_dataset(paths, ["tags paths", "values"])

The filter_dataset function can go through a similar process.
The simplest way to make it more generic is to get rid of the hardcoded column name to filter on:

def filter_dataset(ds, values, col_name):
	data = system.dataset.toPyDataSet(ds)
	return system.dataset.toDataSet(
		list(ds.columnNames),
		[row for row in data if row[col_name] in values]
	)

And change the expression binding to runScript("path/to/filter_dataset", 0, {path/to/full_data}, {path/to/selected_status}, 'values').
Another way would be to instead pass a filtering function, which would be called on each row to decide whether to keep it. But let's not go there right now.
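(If you're curious anyway, here's a rough sketch of that idea, with filter_dataset_by being a made-up name:)

def filter_dataset_by(ds, keep_row):
	# keep only the rows for which the caller-supplied function returns True
	data = system.dataset.toPyDataSet(ds)
	return system.dataset.toDataSet(
		list(ds.columnNames),
		[row for row in data if keep_row(row)]
	)

# usage: filter_dataset_by(ds, lambda row: row['values'] in (0, 1, 2))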

Last thing to note: Those are simple and naive functions, and it would probably be a good idea to make them more resistant to failure.
What happens if some argument is not of the right type? If a dataset is empty? If...?
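Just to give an idea of the kind of guard I mean (a sketch, not a full treatment), the generic filter could for instance tolerate a single value as well as a list:

def filter_dataset(ds, values, col_name):
	# accept a single value as well as a list/tuple of values
	if not isinstance(values, (list, tuple)):
		values = [values]
	data = system.dataset.toPyDataSet(ds)
	return system.dataset.toDataSet(
		list(ds.columnNames),
		[row for row in data if row[col_name] in values]
	)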
But this is also a discussion for another time. For now I need to go and call my grandmother; she's celebrating her 99th year on earth today.
