Hourly CSV to historian

Continuing my energy metering project …

Some of my energy meters upload CSV files by FTP to my server. The tables are in narrow format, and after a little testing I can get Python to open and parse the files.

ExportID               Date                  Value
Temperature T1         2017-02-25T23:15:00   36.3
Temperature T2         2017-02-25T23:15:00   36.2
Meter1 kWh             2017-02-25T23:15:00   175814
Meter2 kWh             2017-02-25T23:15:00   3221679
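For reference, here is a minimal parsing sketch for a file shaped like the sample above. It assumes a comma-delimited file with the same three column headers; the real delimiter may differ, and note that Ignition's gateway runs Jython 2.7, where `StringIO` is imported from the `StringIO` module rather than `io`.

```python
import csv
from datetime import datetime
from io import StringIO  # Jython 2.7: from StringIO import StringIO

# Hypothetical sample matching the post; the real delimiter is an assumption.
SAMPLE = """ExportID,Date,Value
Temperature T1,2017-02-25T23:15:00,36.3
Meter1 kWh,2017-02-25T23:15:00,175814
"""

def parse_rows(text):
    """Parse a narrow-format export into (tag_name, timestamp, value) tuples."""
    reader = csv.DictReader(StringIO(text))
    rows = []
    for r in reader:
        rows.append((
            r["ExportID"],
            datetime.strptime(r["Date"], "%Y-%m-%dT%H:%M:%S"),
            float(r["Value"]),
        ))
    return rows

rows = parse_rows(SAMPLE)
```

With the file read into a string, `parse_rows` returns one tuple per data line, ready to be handed to the historian write.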

Q1. Am I correct in using system.tag.storeTagHistory so that I can write both the t_stamp and the tag value?
Q2. How do I prevent double entries if a CSV file is parsed twice (during development or after a fault)?
Q3. How should I run the script as a scheduled task, say once an hour? Should I create a scheduled task / cron job to launch a client and have the client exit when complete?

Any ideas to prevent me going off on a difficult path would be very welcome.

  1. If you ultimately want this data to go into Ignition’s tag historian and be retrieved on, for instance, Easy Charts, then you can use system.tag.storeTagHistory - although this function is not guaranteed to scale well.
  2. If it’s on an FTP server, then I would just get it off the server, process it, and upload it back with a different filename or extension to indicate that it has already been imported.
  3. A scheduled task could be a gateway timer script that runs, say, every 60 s and checks whether it is the top of the hour before continuing. Alternatively, you could use a transaction group on a schedule to flip a boolean tag, with a tag event script on that tag firing the actual import. Either of these would be much, much better in terms of performance than trying to launch a client.
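For point 1, a sketch of the storeTagHistory call. Building the parallel lists is plain Python; the call itself only works inside an Ignition gateway script (Jython), where `java.util.Date` and the `system` API exist. The "Meters/" folder, "historian" history provider, and "default" tag provider names are all assumptions, not something from the post.

```python
def build_history_args(rows, folder="Meters/"):
    """Split (tag_name, epoch_millis, value) rows into the parallel
    lists that system.tag.storeTagHistory expects."""
    paths, values, millis = [], [], []
    for name, ts, value in rows:
        paths.append(folder + name)  # hypothetical tag folder
        values.append(value)
        millis.append(ts)
    return paths, values, millis

def store_rows(rows):
    # Runs only inside Ignition (Jython); hypothetical provider names.
    from java.util import Date
    paths, values, millis = build_history_args(rows)
    system.tag.storeTagHistory(
        "historian",   # history provider name: an assumption
        "default",     # realtime tag provider
        paths,
        values,
        None,          # qualities: omit for Good quality
        [Date(m) for m in millis],
    )
```

Because the timestamps are passed explicitly, the stored t_stamp is the one from the CSV, not the time the script ran.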
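For point 2, one way to make a re-parse impossible is to rename each file after a successful import and only ever pick up files still ending in `.csv`. The `.done` suffix is an arbitrary choice, and `mark_processed` touching the filesystem is a sketch, not the author's method.

```python
import os

PROCESSED_SUFFIX = ".done"   # marker suffix: an arbitrary choice

def pending_files(names):
    """Candidates for import: only names still ending in .csv.
    Files renamed to .csv.done no longer match and are skipped."""
    return sorted(n for n in names if n.lower().endswith(".csv"))

def mark_processed(path):
    """Rename the file after a successful import so a second run
    of the script cannot parse it again."""
    os.rename(path, path + PROCESSED_SUFFIX)
```

Renaming only after the historian write succeeds means a crash mid-import leaves the file eligible for a retry.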
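For point 3, the top-of-the-hour check inside a 60-second gateway timer script can be as small as this; exactly one tick per hour lands in minute zero, so the import fires once an hour.

```python
from datetime import datetime

def should_run(now=None):
    """True only on the one 60 s timer tick that falls in minute zero."""
    now = now or datetime.now()
    return now.minute == 0

if should_run():
    pass  # call the CSV import here
```

If the timer interval were longer than a minute, the check would need a tolerance window instead of an exact minute match.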