Script: Tag Values exported to CSV every 1H, with data records for every 10 mins

Hi there,

We need to grab/export data from a PLC (coming in over OPC UA) to a CSV file. Initially we are interested in tag name, datetime, value, and quality. Each exported file should contain the records for the last 4 hours, with one value every 10 minutes, and a new file will be generated every hour. That means each file will have 25 records, e.g. file File202205022100 for 2 May 21:00 will have records from 17:00, 17:10, 17:20 … 21:00.

Can anybody suggest a good approach for this? Any help would be appreciated.
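For reference, the hourly filename pattern described above (File202205022100 for 2 May 21:00) could be built like this. This is a plain-Python sketch; the function name and the `.csv` extension are illustrative assumptions:

```python
from datetime import datetime

def hourly_filename(now):
    # Snap to the top of the current hour, then format as FileYYYYMMDDHHMM.
    top_of_hour = now.replace(minute=0, second=0, microsecond=0)
    return top_of_hour.strftime("File%Y%m%d%H%M") + ".csv"

print(hourly_filename(datetime(2022, 5, 2, 21, 0)))  # File202205022100.csv
```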

Set up your tags to historize appropriately (as periodic, with a sample rate and max time of 10 minutes, I think).

To convert, I would suggest a scheduled gateway event. In the script, you’ll need to retrieve your tag history. Once you have your results, store them to CSV using standard functions. See docs here:

To write the CSV to file:

import os
# Get a unique temporary file path with a .csv extension
temp_filename = system.file.getTempFile('csv')
# Convert the history results to CSV text and write them out
system.file.writeFile(temp_filename, system.dataset.toCSV(dataset))
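Putting the pieces together, a minimal sketch of the scheduled gateway event script might look like the following. The tag path, export folder, and "Wide" return format are assumptions, and Ignition's system.* functions only exist in a gateway/designer scope, so treat this as a starting point rather than a tested implementation:

```python
# Hedged sketch of an hourly scheduled gateway event (Ignition Jython).
paths = ["[default]Plc/MyTag"]            # hypothetical tag path -- adjust
end = system.date.now()
start = system.date.addHours(end, -4)     # last 4 hours

data = system.tag.queryTagHistory(
    paths=paths,
    startDate=start,
    endDate=end,
    intervalMinutes=10,                   # one row every 10 minutes
    returnFormat="Wide",
)

# Name the file after the top of the current hour, e.g. File202205022100.csv
filename = "C:/exports/File" + system.date.format(end, "yyyyMMddHH") + "00.csv"
system.file.writeFile(filename, system.dataset.toCSV(data))
```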

Tim, how would you schedule the tag historian to record on the hour and every ten minutes? Surely this is a job for a scheduled gateway event script set to 0/10 * * * * or else a transaction group with a 10 minute trigger starting on the hour?


I think a transaction group would work well here also, good call. They’re never my first thought for some reason. Here’s what I think is the right setting:


That’s the setting for the tag history. If the OP wants this to occur on the hour and every 10 minutes, then a 10-minute tag group would need to be set up. (The OP has a choice: use a transaction group as described in my first post, or this method in the tag historian.)

The driving expression should be set to
dateExtract(now(1000), "minute") % 10
The modulo arithmetic (the expression evaluates to 0 at each ten-minute mark) combined with One Shot: true should cause this to fire once on the hour and every ten minutes after.
You then set the tag to sample based on that tag group.
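As a plain-Python sanity check (outside Ignition) of the modulo logic above, the minutes at which dateExtract(now(1000), "minute") % 10 evaluates to 0 are exactly the ten-minute marks:

```python
# Minutes of the hour at which (minute % 10) == 0, i.e. when the
# driven tag group's expression condition would be satisfied.
firing_minutes = [m for m in range(60) if m % 10 == 0]
print(firing_minutes)  # [0, 10, 20, 30, 40, 50]
```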

Historical tag group


Thank you both for your thoughts. The tag historian approach has been really detailed; it is appreciated. I wonder how the transaction group approach would look. If I am not wrong, I assume we can trigger the transaction group on a schedule, matching exactly every 10 minutes, starting at HH:00. Is that correct? However, I am not sure how to pass the data to a CSV file. Perhaps it would be by using the system.dataset.exportCSV function? Any suggestion? Thanks in advance.

That sounds appropriate. I’ve never tried it. Maybe someone else can respond. The manual gives an example.

Watch out for one thing. The example uses the filename data.csv, which means you could have a conflict if multiple clients were triggered at the same time. The standard web technique to avoid this is to use a random filename for the task and then delete the file when done, e.g.:

import random
# Random suffix makes collisions between simultaneous clients unlikely
filename = "data%s.csv" % (random.randint(0, 1000000))
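Putting the random-name suggestion together with the delete-when-done step, a minimal stand-alone sketch (the header columns and the use of the system temp directory are assumptions):

```python
import os
import random
import tempfile

# Build a collision-resistant CSV path in the system temp directory.
filename = os.path.join(tempfile.gettempdir(),
                        "data%s.csv" % random.randint(0, 1000000))

with open(filename, "w") as f:
    f.write("tagname,datetime,value,quality\n")  # header row is an assumption

# ... hand the file off (move it, attach it, etc.) ...

os.remove(filename)  # delete the temp file when done, as suggested above
```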

(Check this in the script console. I haven’t!)