Is it possible to make a view that will display the most recently updated tags in order of when they were updated?
Edit: For changed configuration
Yes. You'll need to have auditing turned on, and then you can just run a query on the audit log, filtering to tag edit
actions.
E.g. (for MS SQL Server)
SELECT TOP 100
*
FROM
AUDIT_EVENTS
WHERE
ACTION = 'tag edit'
ORDER BY EVENT_TIMESTAMP DESC
I think he meant updates of values, not edits.
I guess you could use the historian and fetch the n last historized values ?
Depending on what tags you want to monitor (all of them? Some specific subset?), you could add a table in your database and a "tag change" gateway event. Add the tags you want to monitor, and script an insert into the database. Prune regularly.
Or use a memory dataset tag instead of a db table.
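To make the insert-and-prune idea concrete, here's a runnable sketch in plain Python using sqlite3 as a stand-in for the gateway database (table and column names are made up; in an actual gateway event script the insert would go through `system.db.runPrepUpdate` against a named datasource, and the prune SQL would depend on your database):

```python
import sqlite3

# Stand-in for the gateway database; a real script would use
# system.db.runPrepUpdate against a configured datasource instead.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tag_changes (tag_path TEXT, value TEXT, ts TEXT)")

def record_change(tag_path, value, ts, max_rows=100):
    """What a 'tag change' gateway event would do: insert, then prune."""
    conn.execute("INSERT INTO tag_changes VALUES (?, ?, ?)", (tag_path, value, ts))
    # Prune: keep only the max_rows most recent entries.
    # (rowid is SQLite-specific; other databases would use their own idiom.)
    conn.execute(
        "DELETE FROM tag_changes WHERE rowid NOT IN "
        "(SELECT rowid FROM tag_changes ORDER BY ts DESC LIMIT ?)",
        (max_rows,),
    )

# Simulate 150 changes of a hypothetical tag; only the last 100 survive.
for i in range(150):
    record_change("[default]Gate_01_state", str(i % 2),
                  "2024-01-01T00:%02d:%02d" % (i // 60, i % 60))

count = conn.execute("SELECT COUNT(*) FROM tag_changes").fetchone()[0]
print(count)  # 100
```

A table component can then just bind to `SELECT * FROM tag_changes ORDER BY ts DESC`.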
Ah, that might make more sense. In that case, I don't know a good method...
This wouldn't work for unhistorised tags
There is a timestamp in the qualified tag values... (although this value might reset if the gateway restarts)
With browse you could go over all of them. This works, but there is probably a more efficient method (for example, readBlocking them all at once).
I'm sure @pascal.fragnoud will improve this
browsedTags = system.tag.browse(path='[default]', filter={})

def getTime(tagPath):
    # readBlocking expects a list of tag paths
    return system.tag.readBlocking([tagPath])[0].getTimestamp()

sortedPaths = sorted([str(x['fullPath']) for x in browsedTags.getResults()], key=getTime, reverse=True)
Keep in mind that the timestamps Ignition gets from an OPC UA server, or generates in its own drivers, are almost always the timestamp of the poll, when the change was detected by OPC, not the actual timestamp when the signal changed in the real device. This means the ordering will be pseudo-random for signals that changed closer together in time than the poll interval.
If you really need sequence of events, you must timestamp precisely in the PLC and then buffer transfer to Ignition.
Here's how I'd write that:
paths = [str(tag['fullPath']) for tag in system.tag.browse(path='[default]', filter={'recursive': True})]
sorted_tags = sorted(system.tag.readBlocking(paths), key=lambda qv: qv.timestamp, reverse=True)
Mainly to avoid reading tags individually, but I'd probably stick with a gateway event tag change script to maintain a list of the n most recent tags. I wouldn't want to have to browse a full tag provider, then sort all of them, just to get a few of them.
Pretty sure it was fullPath. Otherwise, much better than mine
And yeah, I wouldn't really do this either, but you can filter some more on the browse, I guess, if it's only a couple of tags you want.
It is, I mistyped it.
Hi all, I've come up with a solution.
First, I've created these global functions.
This function creates a list of all tags in my project:
def browse_tags_recursive(path):
    # Browse the tags at the given path
    results = system.tag.browse(path)
    tag_list = []
    for result in results.getResults():
        # If the result is a folder, recurse into it
        if result['hasChildren']:
            sub_folder_tags = browse_tags_recursive(result['fullPath'])
            tag_list.extend(sub_folder_tags)
        else:
            # If the result is a tag, add it to the list
            tag_list.append(result['fullPath'])

    return tag_list
This function sorts the dataset by date:
def sort_dataset_by_column(dataset, column_index, ascending=True):
    # Extract rows as lists
    rows = []
    for i in range(dataset.getRowCount()):
        row = []
        for j in range(dataset.getColumnCount()):
            row.append(dataset.getValueAt(i, j))
        rows.append(row)

    # Sort the rows based on the specified column index
    sorted_rows = sorted(rows, key=lambda x: x[column_index], reverse=not ascending)

    # Rebuild the dataset with sorted rows
    headers = [dataset.getColumnName(i) for i in range(dataset.getColumnCount())]
    sorted_dataset = system.dataset.toDataSet(headers, sorted_rows)

    return sorted_dataset
This function trims the historian data to only display changes:
def filter_dataset_on_column_change(dataset, column_index):
    # Initialize an empty list to store the filtered rows
    filtered_rows = []

    # Get the value of the first row's specified column
    previous_value = None

    # Iterate through the dataset rows
    for i in range(dataset.getRowCount()):
        current_value = dataset.getValueAt(i, column_index)

        # If the current value is different from the previous value, keep the row
        if current_value != previous_value:
            row = [dataset.getValueAt(i, col) for col in range(dataset.getColumnCount())]
            filtered_rows.append(row)

        # Update the previous_value to the current value
        previous_value = current_value

    # Rebuild the dataset with only the filtered rows
    headers = [dataset.getColumnName(i) for i in range(dataset.getColumnCount())]
    filtered_dataset = system.dataset.toDataSet(headers, filtered_rows)

    return filtered_dataset
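The change-filter logic above is easy to sanity-check outside Ignition by running the same comparison over plain lists of rows (a stand-in for the dataset; column 1 is the watched value column, as in the call further down). One detail worth noting: using a sentinel object instead of `None` for the initial previous value means a leading row whose value happens to be `None` is still kept:

```python
def filter_rows_on_column_change(rows, column_index):
    """Keep only rows where the watched column differs from the previous row."""
    filtered = []
    previous = object()  # sentinel, so the first row is always kept
    for row in rows:
        if row[column_index] != previous:
            filtered.append(row)
            previous = row[column_index]
    return filtered

rows = [["t1", 0], ["t2", 0], ["t3", 1], ["t4", 1], ["t5", 0]]
print(filter_rows_on_column_change(rows, 1))  # [['t1', 0], ['t3', 1], ['t5', 0]]
```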
I then use this script, attached to a refresh button, to generate a dataset which I display in a table.
def runAction(self, event):
    # Start browsing from the root of the default provider
    all_tags = myFuncs.browse_tags_recursive('[default]')
    datasets = []
    endTime = system.date.now()
    startTime = system.date.addMinutes(endTime, -30)
    # Poll the historian for each tag's recent data
    for tag_path in all_tags:
        data = system.tag.queryTagHistory(
            paths=[tag_path],
            returnFormat="Tall",
            aggregationMode="MinMax",
            returnSize=0,
            startDate=startTime,
            endDate=endTime
        )
        filtered_data = myFuncs.filter_dataset_on_column_change(data, 1)
        if filtered_data.getRowCount() > 0:
            datasets.append(filtered_data)
    if datasets:
        combined_dataset = datasets[0]
        for ds in datasets[1:]:
            combined_dataset = system.dataset.appendDataset(combined_dataset, ds)

        # Order the combined dataset by the 4th column (most recent date first)
        ordered_dataset = myFuncs.sort_dataset_by_column(combined_dataset, 3, ascending=False)
        self.view.custom.dataset = ordered_dataset
This appears to work. I'll be looking into making a similar function that does not use the tag historian.
This works, but if a tag (e.g. Gate_01_state) changes multiple times, it will only show the most recent change.
Is this avoidable?
Don't do this; system.tag.browse has a recursive option within its filter arg for this. Recursive browsing manually is orders of magnitude slower.
e.g.
system.tag.browse(..., filter={'recursive': True, ...})
It looks like you can create a Gateway Tag Change script using a tag path with wildcards, so you could use this perhaps... no idea how this would scale...
e.g.
Tag paths: [default]*
If you're adding these changes into a dataset tag, you will certainly need to manage this; otherwise you will end up overwriting the dataset and losing added values if you simply read the dataset and write back to it with the inserted value. I would probably keep the data as a list of dicts in memory in a script library and write that into the dataset tag periodically.
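A plain-Python sketch of that in-memory approach, assuming a module-level deque and a lock (the names, the dict keys, and the cap are all made up; in a gateway script library this would live at module scope, with a timer script flushing `snapshot()` into the dataset tag):

```python
from collections import deque
import threading

MAX_ROWS = 5  # arbitrary cap for the example
_lock = threading.Lock()
_recent = deque(maxlen=MAX_ROWS)  # oldest entries fall off the right end automatically

def on_tag_change(tag_path, value, timestamp):
    """Called from each tag-change event; prepends the change under a lock."""
    with _lock:
        _recent.appendleft({"tag_path": tag_path, "value": value, "timestamp": timestamp})

def snapshot():
    """What a periodic flush would write into the dataset tag: rows, newest first."""
    with _lock:
        return [[d["tag_path"], d["value"], d["timestamp"]] for d in _recent]

# Simulate 8 changes; only the 5 most recent are kept, newest first.
for i in range(8):
    on_tag_change("[default]Gate_%02d" % i, i, i)
print(len(snapshot()), snapshot()[0])  # 5 ['[default]Gate_07', 7, 7]
```

The `maxlen` deque handles the pruning for free, and the lock covers the case of several tag-change events firing concurrently.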
No, not on its own. I wasn't suggesting you do this. My suggestion was, and still is, that you use a gateway event to monitor tag changes and maintain a dataset tag of the most recent changes.
It would look something like this
MAX_ROWS = 100  # cap on the number of rows kept

history = system.tag.readBlocking(["path/to/dataset_tag"])[0].value
if history is None:
    history = system.dataset.toDataSet(['tag_path', 'value', 'timestamp'], [])
history = system.dataset.addRow(history, 0, [str(event.tagPath), newValue.value, newValue.timestamp])
if history.rowCount > MAX_ROWS:
    history = system.dataset.deleteRow(history, history.rowCount - 1)
system.tag.writeBlocking(["path/to/dataset_tag"], [history])
Every time one of the monitored tags changes, it will add its path, value, and timestamp at the top of the dataset, then, if there are too many rows, remove the last one.
Haven't tried to run it, so there might be typos and such, but that's pretty much what I had in mind.
Now all you have to do is bind a table to that tag and you'll have the data you want.
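The insert-at-top-and-trim logic is simple enough to check with plain lists standing in for the dataset rows (the cap and tag name here are arbitrary):

```python
MAX_ROWS = 3  # small cap for the example

def add_change(history, tag_path, value, timestamp):
    """Insert the new row at the top, then trim the oldest row if over the cap."""
    history.insert(0, [tag_path, value, timestamp])
    if len(history) > MAX_ROWS:
        history.pop()  # drop the last (oldest) row
    return history

history = []
for i in range(5):
    add_change(history, "Gate_01_state", i % 2, i)
print(history)  # [['Gate_01_state', 0, 4], ['Gate_01_state', 1, 3], ['Gate_01_state', 0, 2]]
```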
Note that if your tags change very quickly, or if you have a LOT of tags, then I'd switch to a timer script that would run every 5 or 10 secs, pull the data from the dataset, make a tag browse, sort them by timestamp, then pop values from both the history dataset and the tags, taking the most recent one, to put it into a new dataset.
something like
tags_paths = [str(tag['fullPath']) for tag in system.tag.browse('', filter={'recursive': True})]
tags = system.tag.readBlocking([HISTORY_PATH] + tags_paths)
# Convert the history dataset rows to plain lists so we can pop() from them
history = [list(row) for row in system.dataset.toPyDataSet(tags[0].value)]
# Sort most recent first, to match the ordering of the history dataset
tags = sorted(zip(tags[1:], tags_paths), key=lambda v: v[0].timestamp, reverse=True)
new_history = []
for i in xrange(MAX_NUM_OF_ROWS):
    if history and tags and history[0][2] > tags[0][0].timestamp:  # column 2 is the timestamp
        new_history.append(history.pop(0))
    elif tags:
        tag = tags.pop(0)
        new_history.append([tag[1], tag[0].value, tag[0].timestamp])
new_history = system.dataset.toDataSet(['tag_path', 'value', 'timestamp'], new_history)
system.tag.writeBlocking([HISTORY_PATH], [new_history])
Completely untested and absolutely optimizable (use queue types for starters), so treat this like pseudo code to get you started.
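Since the snippet above is explicitly pseudocode, the core merge step (two sources, both ordered most-recent-first, keeping at most MAX rows of the newest overall) can be exercised in plain Python with stand-in `(path, value, timestamp)` tuples:

```python
MAX_NUM_OF_ROWS = 4

def merge_recent(history, tags):
    """Merge two most-recent-first lists of (path, value, timestamp) rows,
    keeping at most MAX_NUM_OF_ROWS of the newest rows overall."""
    history, tags = list(history), list(tags)  # don't mutate the caller's lists
    merged = []
    for _ in range(MAX_NUM_OF_ROWS):
        # Take whichever source currently has the newer head row.
        if history and (not tags or history[0][2] > tags[0][2]):
            merged.append(history.pop(0))
        elif tags:
            merged.append(tags.pop(0))
    return merged

history = [("a", 1, 50), ("b", 2, 30)]
tags = [("c", 3, 60), ("d", 4, 40), ("e", 5, 10)]
print(merge_recent(history, tags))  # [('c', 3, 60), ('a', 1, 50), ('d', 4, 40), ('b', 2, 30)]
```

This is the same two-pointer merge as in merge sort, just capped at a fixed output length.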
This won't work if multiple tags change at once. Edit: that is, if these are in tag change events. The dataset tag will be read by all of them at potentially the same time, the new row added to a local variable, and then each will write back to the dataset tag, overwriting each other in the process. Only the last to execute will actually make it into the dataset.
Thanks
I used this in combination with a query to the alarm journal to produce a sequence of events that captures all tag changes and all alarm changes
Are you sure about that? I thought gateway events were queued.
Ah, I think you're right. I had tag change scripts on the brain!