Using system.tag.getConfiguration and writing the output back to system.tag.configure (without any changes) will result in folders being duplicated, renamed "_types_", and put underneath the existing folder. This seems to only affect folders in the UDT Definitions.
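Roughly, the round trip I'm describing looks like this (provider and collision policy are placeholders, not my exact script):

provider = "[Test]"
# Read the full UDT definition configuration, folders included.
configs = system.tag.getConfiguration(provider + "_types_", True)
# Write it straight back with no changes; this is where the duplicate _types_ folders show up.
system.tag.configure(provider + "_types_", configs, "o")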
My goal is to mass edit UDT definitions for a project. I have done this before successfully with a project that did not have a folder in the UDT definitions.
Here are my UDT Definitions before running the script:
I found this topic, System.tag.configure with UDTs in Folder, which sounds like my issue but was not resolved. When I used the workaround provided it corrupted my existing parameters, and the original author was unsure why it worked. As that topic is from 5 years ago, I started this new one.
There is a very good chance I have done something wrong or am not understanding how to use system.tag.configure correctly. Any help would be much appreciated.
I would try removing the _types_ folder from the cfg you get from getConfiguration (i.e. take just the tags array from the returned _types_ folder), then setting the basePath in configure to the types folder, so that you're writing the contents of the types folder back into the types folder, if that makes sense. It's hard to explain on mobile without being able to provide examples.
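Something like this, roughly (untested, provider name is a placeholder):

cfg = system.tag.getConfiguration("[default]_types_", True)
# cfg is a one-element list whose entry is the _types_ folder itself;
# grab just its child tags so the folder doesn't get nested again on write.
typeDefs = cfg[0]['tags']
# Write the definitions straight back into the types folder.
system.tag.configure("[default]_types_", typeDefs, "m")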
Using system.tag.getConfiguration() and pushing the results into system.tag.configure() is NOT a "no-op". Getting configuration supplies default properties and data structures that aren't actually part of the as-stored configuration. Sending that back through will instantiate all of that and will have inevitable clashes.
Whatever you are actually trying to do needs to be boiled down to the minimum change and only that sent to system.tag.configure() (using merge mode).
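For example, to add or change a single parameter default on one definition, send only that and nothing else (the names here are placeholders):

# Minimum change: one parameter on one UDT definition, with collision policy "m" (merge).
minimalChange = [{
    "name": "ANALOG",
    "tagType": "UdtType",
    "parameters": {
        "Device": {"dataType": "String", "value": "PLC1"}
    }
}]
system.tag.configure("[default]_types_", minimalChange, "m")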
I know this is correct for UDT instances, but from memory the cfg it returns for UDT definitions is exactly as stored. Can't confirm either way at the moment though…
Let me preface this by saying that I went down a rabbit hole to get something similar to work, and I’m sure there will be an edge case that I’m missing for a specific datatype. I would not use this in a production system, and I wouldn’t use it on a development system without having a backup.
EDIT: Forgot to mention that you do still need to remove the path keys to make sure you don’t get the extra _types_ folder(s). You also might need to run system.tag.configure twice if the UDTs being configured inherit from each other.
For the parameters, you’ll need to convert the datatype from a reference to the java datatype into a string representation of the datatype.
If you have any dataset tags in the UDT, then you probably need to convert the dataset into a json string representation with the type strings for java types not python types. This may only be necessary if it contains certain datatypes, but I don’t fully remember.
The code below is definitely not perfect, but if you want to go down this path, hopefully it helps point you in the right direction.
# BasicDataset is needed for the isinstance check in convert_dataset below
from com.inductiveautomation.ignition.common import BasicDataset

def convert_parameters(params):
    # type: (dict) -> dict
    ''' Converts a parameter into a dictionary for system.tag.configure
    Only place I know of that gives you this type is nested within the system.tag.getConfiguration output.
    Args:
        params (dict): parameter dictionary from a tag/UDT definition.
    Returns:
        dict: The modified dictionary of dictionaries
    '''
    return {key: {"dataType": str(value.datatype), "value": value.value} for key, value in params.items()}

def dataset_to_list(dataset):
    # type: (Dataset) -> list[list]
    """Returns a list containing each row of the dataset.
    Does not provide the headers list. You can use dataset.getColumnNames()
    to get the headers list. Probably never need this. The two use cases I
    was using it for have built-in functions in Ignition.
    The little guy told me to leave it in just in case.
    Args:
        dataset (Dataset): Dataset to extract the rows from.
    Returns:
        list[list]: A list of lists. Each index is one row of the dataset.
    Raises:
        none
    """
    new_table = []
    for row in range(dataset.getRowCount()):
        row_to_add = []
        for column in range(dataset.getColumnCount()):
            row_to_add.append(dataset.getValueAt(row, column))
        new_table.append(row_to_add)
    return new_table

def to_json_string(input_dataset):
    # type: (Dataset) -> str
    """ Converts the provided dataset to a json string. Specifically intended for use with system.tag.getConfiguration and system.tag.configure.
    Datasets in UDT configurations can get a little wonky with the types.
    This handles converting the type strings into java type strings so that system.tag.configure will properly create and update the dataset.
    Args:
        input_dataset (Dataset): dataset to jsonify.
    Returns:
        str: The json string representing the input_dataset.
    """
    dictionary = {}
    columns = []
    for key in input_dataset.getColumnNames():
        value = input_dataset.getValueAt(0, key)
        if str(type(value)) == "<type 'bool'>":
            new_type = 'java.lang.Boolean'
        elif str(type(value)) == "<type 'unicode'>":
            new_type = 'java.lang.String'
        elif str(type(value)) == "<type 'java.awt.Color'>":
            new_type = 'java.awt.Color'
        elif str(type(value)) == "<type 'int'>":
            new_type = 'java.lang.Integer'
        else:
            # Fall back to the column's declared Java class so new_type is never left undefined
            new_type = input_dataset.getColumnType(input_dataset.getColumnIndex(key)).getName()
        columns.append({'name': key, 'type': new_type})
    dictionary['columns'] = columns
    dictionary['rows'] = dataset_to_list(input_dataset)
    encoded = system.util.jsonEncode(dictionary)
    return encoded

def convert_dataset(value):
    if isinstance(value, BasicDataset):
        if value.getRowCount() > 0:
            return to_json_string(value)
    return value

def replace_parameters(udt_configs):
    # type: (list | dict) -> list | dict
    ''' Recursively replaces parameters dictionaries from system.tag.getConfiguration with proper dictionaries for system.tag.configure
    Args:
        udt_configs (list | dict): Usually the return from system.tag.getConfiguration that needs to be modified
    Returns:
        list | dict: The modified object
    '''
    # system.tag.getConfiguration returns a list, so call recursively if top level is list
    if isinstance(udt_configs, list):
        return [replace_parameters(item) for item in udt_configs]
    elif isinstance(udt_configs, dict):
        return {
            # Replace parameters with converted dictionary compatible with system.tag.configure
            key: convert_parameters(value)
            if key == "parameters"
            # Recursively call if dictionary or list to make sure we traverse the whole folder structure
            else (replace_parameters(value) if isinstance(value, (dict, list)) else convert_dataset(value))
            for key, value in udt_configs.items()
        }
    else:
        return udt_configs
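Not in the original code above, but for the path-key removal mentioned in the edit, a rough sketch of a helper and how the pieces might be wired together could look like this (provider path is a placeholder, untested as a whole):

def strip_paths(obj):
    # type: (object) -> object
    ''' Recursively drops "path" keys from the structure returned by
    system.tag.getConfiguration so configure does not recreate an
    extra _types_ folder.
    '''
    if isinstance(obj, list):
        return [strip_paths(item) for item in obj]
    elif isinstance(obj, dict):
        return {key: strip_paths(value) for key, value in obj.items() if key != "path"}
    return obj

# Possible usage (run configure twice if definitions inherit from each other):
configs = system.tag.getConfiguration("[default]_types_", True)
cleaned = replace_parameters(strip_paths(configs[0]['tags']))
system.tag.configure("[default]_types_", cleaned, "m")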
Thanks everyone.
It's a bit disappointing that these limitations are not mentioned in the system.tag.configure documentation. The first example there even shows editing multiple tags using the output from system.tag.getConfiguration.
Yes, I understand. I'll look into this, but from @Clancy_Cavanaugh's response it might be better to keep the folder and remove the paths. I'll test both methods.
I think it may be more of an issue that system.tag.getConfiguration() returns an output that isn't able to be directly passed into system.tag.configure(). To the best of my knowledge this issue only exists when using it for UDT definitions.
I do agree it would be good to either fix or document this limitation.
I don’t know the history, but it seems like the type of issue that would come up if system.tag.getConfiguration() was created before UDTs existed in Ignition.
It turns out calling system.tag.exportTags with no filePath specified allows you to save the output to a variable. It operates much like system.tag.getConfiguration, except it specifies parameters like {u'Device': {'dataType': 'String', 'value': u''}}, whereas system.tag.getConfiguration uses {u'Device': {datatype=String, value=}}, which requires conversion before it can be imported again.
I can also confirm it:
Does not have the issue of creating extra "_types_" folders
Does not require the 'path' keys to be removed (as they don't exist, unlike in the system.tag.getConfiguration output)
Has not messed up my existing parameters after import
The test code below can use either system.tag.configure or system.tag.importTags to bring the UDT definitions back into the provider. However, unlike system.tag.exportTags, system.tag.importTags does expect a file path, so it's extra work to write the changes to a file before importing. It does feel like system.tag.importTags is the correct function to use if most of the data was generated with the related function system.tag.exportTags; I assume these functions were designed to work with each other.
As @pturmel suggested, the import data should only include the tags that have changes; however, the test code does not do this yet. I also noticed that when using system.tag.configure, a merge would only work if the tags I was changing were at the same level as the basePath; if they were in a subfolder I had to use override to get the change in.
Another benefit of using system.tag.exportTags is that it accepts a tagPaths variable rather than a basePath, meaning you can target just the tags you want to edit. However, if there are UDTs in different folders, the folder needs to be added into the json (with the relevant tags under it), or you need to do multiple import operations, one for each scope.
Test code:
def processTags(tags):
    for tag in tags:
        if tag['tagType'] == 'UdtType':
            if tag['name'] == 'ANALOG':
                tag['parameters'][u'newPram1'] = {u'dataType': u'String', u'value': u'Medium2'}
                tag['parameters'][u'newPram2'] = {u'dataType': u'String', u'value': u'High2'}
        elif tag['tagType'] == 'Folder':
            processTags(tag['tags'])

filePath = "C:/<Some Path>/testExport.json"
filePath_Backup = "C:/<Some Path>/testExport_backup.json"
path = ['[Test]_types_']
# Backup to file
system.tag.exportTags(filePath=filePath_Backup, tagPaths = path, recursive=True)
# save json to variable
json_raw_results = system.tag.exportTags(tagPaths = path, recursive=True)
# decode to python dict:
py_results = system.util.jsonDecode(json_raw_results)
# find and modify the just the ANALOG tag
processTags(py_results['tags'])
# encode back to a json string
json_mod_results = system.util.jsonEncode(py_results, 4)
# write to file for import
system.file.writeFile(filePath, json_mod_results, False)
# write to system
basePath = "[Test]"
system.tag.importTags(filePath=filePath, basePath=basePath, collisionPolicy='o')
# The below also works
#system.tag.configure(basePath, py_results, "o")
This looks promising. The only advantage I see for system.tag.getConfiguration is that you do not have to jsonDecode it, but it really is not an advantage because of the other issues.
I had a note on one of my lists to see if system.tag.exportTags would work better, but I think at the time I still had a gateway or two that had not been updated to 8.1.8 or higher, which would have required writing to a file and then reading it back in.