How do you get the system.tag.browse results into a Python dataset, and then get that into a Dataset memory tag? I get stuck at data.append() when trying to build a dataset from the tag.browse() results. I looked at appendDataset() and toDataSet(), can't get my head around this…
One way you could do it is to build a list from the returned data by appending the values from each browse result to an empty list (here I used listHolder). Specify the headers of the result data in another list, and with system.dataset.toDataSet you can convert all of that into a dataset. Once you have that, you can write the dataset to a tag! Here I only browsed one level deep, but you could do something very similar for a multi-level browse.
I recommend taking a look at the Ignition documentation for dictionaries and lists, as it goes into more detail on this style of manipulating data.
https://docs.inductiveautomation.com/display/DOC80/Dictionaries
https://docs.inductiveautomation.com/display/DOC80/Lists+and+Tuples
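To get a feel for that pattern outside Ignition, here is a plain-Python sketch. The dictionaries below are made-up stand-ins for the objects system.tag.browse returns; the list-building is the same either way:

```python
# Made-up sample results; real ones come from system.tag.browse().
results = [
    {"fullPath": "[default]Pump1", "hasChildren": False,
     "name": "Pump1", "tagType": "AtomicTag"},
    {"fullPath": "[default]Motors", "hasChildren": True,
     "name": "Motors", "tagType": "Folder"},
]

headers = ["fullPath", "hasChildren", "name", "tagType"]
rows = []
for result in results:
    # Pull the values out in the same order as the headers
    rows.append([result[h] for h in headers])

# In Ignition you would now call system.dataset.toDataSet(headers, rows)
print(rows[0])  # ['[default]Pump1', False, 'Pump1', 'AtomicTag']
```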
```python
results = system.tag.browse(path='[default]')
listHolder = []
for result in results.getResults():
    listHolder.append(result.values())
headers = ["fullPath", "hasChildren", "name", "tagType"]
dataSet = system.dataset.toDataSet(headers, listHolder)
system.tag.write("[default]Test Tag", dataSet)
```
Thank you @nathanielbrown for the prompt reply! That is great and simple… I get crossed up with .toDataSet, .appendDataset, .getComponent, .write, and .writeBlock…
Doesn't 'result' already contain the correctly formatted data without using .values()? Because this dataset will be only two columns, and I have figured out I can get a string using result['fullPath'] and result['name']…
@hwbrill each "result" is a dictionary of key-value pairs, where the keys are the ones I specified in the headers list. If you only want those two columns, you could just get the string values using:
```python
listHolder.append([str(result['fullPath']), str(result['name'])])
```
Then, just make the headers those two columns.
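Put together, a minimal sketch of the two-column version might look like this (the dictionaries below are made-up placeholders for the real browse results):

```python
# Placeholder browse results; real ones come from system.tag.browse()
results = [
    {"fullPath": "[default]Pump1", "name": "Pump1"},
    {"fullPath": "[default]Pump2", "name": "Pump2"},
]

headers = ["fullPath", "name"]
listHolder = []
for result in results:
    # Keep only the two columns we care about, as strings
    listHolder.append([str(result["fullPath"]), str(result["name"])])

# In Ignition: dataSet = system.dataset.toDataSet(headers, listHolder)
```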
Thank you… So, when I click the function, it takes me to the system.dataset.toDataSet doc in the manual, but when I look in the Scripting Function Index I don't see it… Am I reading too much into it? This is where I'm having trouble figuring out which functions are the best solution…
Yes!
```python
system.util.jsonDecode(str(system.tag.browse(path, filter)))
```
That function has multiple formats. The first one in the manual shows it taking a single argument: a PyDataset to be converted into a plain Dataset.
Now I am having trouble figuring out how to iterate through a passed PyDataset… The following keeps printing the row as one column in def getFPlist(), no matter what I do…
```python
# Dataset memory tag configuration properties
dataType = 'DataSet'
tagType = 'AtomicTag'
valueSource = 'memory'

# Dataset memory tag configuration dictionary
config = {'valueSource': valueSource, 'dataType': dataType, 'name': 'FP List', 'tagType': tagType}

data = []

def getFPlist(fpList):
    for r in fpList:
        for c in fpList:
            print fpList[r][c]
    return

def browseTags(path, filter):
    results = system.tag.browse(path, filter)
    for result in results.getResults():
        if result['hasChildren'] == True and str(result['tagType']) != 'Folder':
            fullPath = str(result['fullPath'])
            splitPath = fullPath.split('/')
            c_name = splitPath[1]
            desk_name = splitPath[2]
            fp_name = splitPath[3]
            data.append([c_name, desk_name, fp_name])
            print c_name, desk_name, fp_name
        elif str(result['tagType']) == 'Folder':
            browseTags(str(result['fullPath']), filter)

system.tag.configure('Schafer', [config], 'o')
header = ['c_name', 'desk_name', 'fp_name']
table = system.dataset.toDataSet(header, data)
# system.tag.write('Schafer/FP List', table)
fpList = system.dataset.toPyDataSet(table)
browseTags('Schafer', {})
getFPlist(fpList)
```
The markdown didn't indent the code in the getFPlist() function, but it runs in the Python console…
Don’t use plain markdown. This forum is built on discourse. Use ``` markers above and below your code.
You probably want:

```python
def getFPlist(fpList):
    for r in fpList:
        for c in r:
            print(c)
    return
```
Your code wouldn’t compile as posted, but I assume you simply chopped it down incorrectly.
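For anyone following along, the corrected nested loop can be tried in plain Python, with a list of lists standing in for the PyDataset (a PyDataset iterates the same way: the outer loop yields rows, the inner loop yields the values within each row):

```python
# A list of lists standing in for a PyDataset; the sample values
# are made up for illustration.
fpList = [
    ["c1", "desk1", "fp1"],
    ["c2", "desk2", "fp2"],
]

def getFPlist(fpList):
    for r in fpList:    # r is one row
        for c in r:     # c is one value in that row
            print(c)

getFPlist(fpList)
```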