We need to export data from a PLC to a CSV file; the data comes in over OPC UA. Initially we are interested in tagname, datetime, value and quality. Each exported file should contain the records for the last 4 hours, sampled every 10 minutes, and a new file should be generated every hour. That means each file has 25 records: e.g. File202205022100, generated on the 2nd of May at 21:00, holds the samples from 17:00, 17:10, 17:20 … 21:00.
Can anybody suggest a good approach for this? Any help would be appreciated.
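To make the requirement concrete, the naming and rolling window described above can be sketched in plain Python (the "File" + yyyyMMddHHmm name pattern is taken from the example in the question; everything else is just stdlib datetime arithmetic):

```python
from datetime import datetime, timedelta

def export_plan(run_time):
    """Given the top-of-hour run time, return the file name and the
    25 ten-minute sample timestamps covering the previous 4 hours."""
    # File202205022100 -> "File" + yyyyMMddHHmm
    filename = run_time.strftime("File%Y%m%d%H%M")
    start = run_time - timedelta(hours=4)
    # 4 hours at 10-minute steps, both endpoints inclusive -> 25 stamps
    stamps = [start + timedelta(minutes=10 * i) for i in range(25)]
    return filename, stamps

name, stamps = export_plan(datetime(2022, 5, 2, 21, 0))
# name == "File202205022100"; stamps run 17:00, 17:10, ... 21:00
```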
Tim, how would you schedule the tag historian to record on the hour and every ten minutes? Surely this is a job for a scheduled gateway event script set to 0/10 * * * *, or else a transaction group with a 10-minute trigger starting on the hour?
I think a transaction group would work well here also, good call. They’re never my first thought for some reason. Here’s what I think is the right setting:
That’s the setting for the tag history. If the OP wants this to occur on the hour and every 10 minutes, then a 10-minute tag group would need to be set up. (The OP has a choice to use a transaction group as described in my first post, or this method in the tag historian.)
The driving expression should be set to dateExtract(now(1000), "minute") % 10
The modulo arithmetic and the One Shot: true setting should cause this to fire once on the hour and every ten minutes thereafter.
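As a toy illustration (not the Ignition expression language itself, just the same arithmetic in plain Python): the expression evaluates to 0 exactly on multiples of ten minutes, which is what lets a one-shot edge trigger fire at :00, :10, :20, and so on:

```python
def should_fire(minute):
    # Mirrors dateExtract(now(1000), "minute") % 10:
    # the expression is 0 only on exact multiples of ten minutes,
    # and One Shot ensures a single firing per transition.
    return minute % 10 == 0

fired = [m for m in range(60) if should_fire(m)]
# -> [0, 10, 20, 30, 40, 50]: on the hour and every ten minutes after
```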
You then set the tag to sample based on that tag group.
Thank you both for your thoughts — the tag historian approach was really detailed, and it is appreciated. I wonder what the transaction group approach would look like. If I am not wrong, we can trigger the transaction group on a schedule that matches exactly every 10 minutes, starting at HH:00 — is that correct? However, I am not sure how to pass the data to a CSV file. Perhaps it would be by using the system.dataset.exportCSV function? Any suggestion? Thanks in advance.
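One caution on system.dataset.exportCSV: as far as I know it is a Vision-client function that opens a save dialog, so for an unattended gateway export the usual pair is system.dataset.toCSV plus system.file.writeFile (check the Ignition docs for your version). The CSV shape itself can be sketched with the stdlib csv module — the column names below come from the question, and the sample rows are invented for illustration:

```python
import csv
import io

# Columns as listed in the question; the rows are made-up sample data.
header = ["tagname", "datetime", "value", "quality"]
rows = [
    ["Line1/Temp", "2022-05-02 17:00:00", 71.3, "Good"],
    ["Line1/Temp", "2022-05-02 17:10:00", 71.9, "Good"],
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(header)
writer.writerows(rows)
csv_text = buf.getvalue()
# In Ignition you would write csv_text to disk, e.g. with
# system.file.writeFile(path, csv_text) on the gateway.
```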
Watch out for one thing: the example uses the filename data.csv, which means you could have a conflict if multiple clients were triggered at the same time. The standard web technique to avoid this is to use a random filename for the task and then delete it when done, e.g.:
import random

# Append a random integer so concurrent clients don't collide on one file
filename = "data%s.csv" % random.randint(0, 1000000)
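If you would rather have the standard library guarantee uniqueness than rely on a random integer (two clients could still draw the same number), tempfile does this; a sketch, assuming the export code writes to the returned path and deletes it afterwards as suggested above:

```python
import tempfile

# delete=False so the exporter can write to the path and hand it on;
# remember to remove the file when the task is done.
f = tempfile.NamedTemporaryFile(prefix="data", suffix=".csv", delete=False)
filename = f.name
f.close()
```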