I would like to automate historical data collection (data logs) from hundreds of remote RTUs. These devices can send timestamped CSV/JSON files, either on exception or periodically, via secure FTP to designated folders on a Linux server. I want to scan these folders periodically for new file arrivals and automatically import/merge the data logs into the Ignition Historian.
I use a gateway timer script to poll for incoming files. After processing, I move them to an archive folder for later reprocessing or deletion.
import os, shutil

inputPath = '/path/to/input/folder/'
archivePath = '/path/to/archive/folder/'

# get list of files
files = os.listdir(inputPath)

if len(files) > 0:
    for file in files:
        inputFile = inputPath + file
        dataIn = system.file.readFileAsString(inputFile).splitlines()
        for row in dataIn:
            # process stuff here
            pass
    # move the files to the archive folder
    for file in files:
        shutil.move(inputPath + file, archivePath + file)
Thanks Jordan. I always love your solutions because you hit the nail on its head. BTW, I don't want any load or dependency on Ignition, as I would like to have a generalized solution. Is it not possible to have an independent solution using PHP/Python scripting? Can you please also look into the second part, automatically importing into the Ignition historian database in MySQL?
Something like this would put a CSV file into a DB table. pymysql is pure Python.
# csv to MySQL
import os, shutil, csv, pymysql

inputPath = '/path/to/input/folder/'
archivePath = '/path/to/archive/folder/'

# get list of files
files = os.listdir(inputPath)

if len(files) > 0:
    # set MySQL connection
    connection = pymysql.connect(host='host.com', user='', password='', db='',
                                 cursorclass=pymysql.cursors.DictCursor)
    # set query -- replace "table" with your table name, one %s per CSV column
    query = 'INSERT INTO table VALUES (%s, %s, %s, %s, %s)'
    # loop through files
    for file in files:
        inputFile = inputPath + file
        # open the csv file and write each row to the db
        with open(inputFile) as f:
            dataIn = csv.reader(f, delimiter=',', quotechar='"')
            with connection.cursor() as cursor:
                for line in dataIn:
                    cursor.execute(query, line)
        # commit after each file
        connection.commit()
    # close the MySQL connection
    connection.close()
    # move the files to the archive folder
    for file in files:
        shutil.move(inputPath + file, archivePath + file)
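Since the goal is a solution that runs entirely outside Ignition, the script above can simply be scheduled with cron, or wrapped in a small polling loop. A minimal sketch of the loop approach, assuming the CSV-to-MySQL logic above is refactored into a process_incoming() function (a hypothetical name) and a one-minute scan interval:

# Hypothetical standalone wrapper: poll the input folder on a fixed interval,
# with no dependency on Ignition. process_incoming() stands for the
# csv-to-MySQL import/archive logic from the script above.
import time
import traceback

POLL_SECONDS = 60  # assumed one-minute scan interval

def process_incoming():
    # place the csv-to-MySQL import and archive logic here
    pass

if __name__ == '__main__':
    while True:
        try:
            process_incoming()
        except Exception:
            # log and keep polling; one bad file should not stop the loop
            traceback.print_exc()
        time.sleep(POLL_SECONDS)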
As far as putting it into the Ignition historian without using Ignition, I can’t help you there, because I don’t use the historian. None of the applications I have here have any need of it.
Ignition’s Historian uses several tables for managing the state of tracked values.
From what I can tell:
sqlth_te - stores a directory of every tracked tag path, along with a generated tagid.
sqlth_partitions - stores a directory of every historian-partitioned table.
sqlt_data_1_2017_11 - and similarly named tables (depending on your partitioning settings) store timestamps and values for each tracked tagid.
You might be able to add entries manually if you look up the correct tagid and current partition table and insert a new row in the expected format. Never tried it, though!
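Purely as an illustration of that idea (untested here), a manual insert might look roughly like this. The table and column names (sqlth_te, sqlth_partitions, tagpath, retired, pname, start_time, end_time, floatvalue, dataintegrity, t_stamp) are assumptions based on a typical historian install, and should be verified against your own schema before use.

# Untested sketch: manually adding a historian row for an already-tracked tag.
# Table/column names are assumptions -- verify against your own database.
import time
import pymysql

connection = pymysql.connect(host='host.com', user='', password='', db='',
                             cursorclass=pymysql.cursors.DictCursor)

tag_path = 'folder/mytag'          # hypothetical tag path
value = 123.4
t_stamp = int(time.time() * 1000)  # historian timestamps appear to be epoch milliseconds

with connection.cursor() as cursor:
    # 1. look up the generated tagid for the tag path
    cursor.execute("SELECT id FROM sqlth_te WHERE tagpath = %s AND retired IS NULL",
                   (tag_path,))
    tagid = cursor.fetchone()['id']

    # 2. find the partition table covering this timestamp
    cursor.execute("SELECT pname FROM sqlth_partitions WHERE start_time <= %s AND end_time >= %s",
                   (t_stamp, t_stamp))
    pname = cursor.fetchone()['pname']

    # 3. insert the value in the expected format (floatvalue column, 192 = 'good' quality)
    cursor.execute("INSERT INTO " + pname + " (tagid, floatvalue, dataintegrity, t_stamp) "
                   "VALUES (%s, %s, %s, %s)",
                   (tagid, value, 192, t_stamp))

connection.commit()
connection.close()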
Found it interesting. Need to dig deeper. I'm working on a non-Ignition-dependent solution as well. Will keep posting.
…
Why FTP instead of MQTT for IoT?
MQTT is an excellent publish/subscribe protocol for monitoring and controlling remote devices, but it is not suited to downloading remote datalog files. The most practical solution for that is SFTP. FTP can also act as a redundant path for receiving remote data. My best bet for a robust IoT solution is "MQTT + FTP".
Edge device manufacturers must look into “MQTT + FTP” solution.
This can be achieved by creating corresponding memory tags in Ignition and writing the values directly to those memory tags through a gateway communication module. The tags get written to the Historian and can be visualized on Ignition screens like any PLC tags. My NJ-SCADA module can do this job of scanning the files received from the FTP server, reading them, and updating the Ignition memory tags. It can also display tags on its own dashboards, tables, and P&IDs.
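For the Ignition side of that idea, a gateway script that parses a received CSV file and pushes the rows into memory tags might look something like the sketch below. The file path, tag folder, and CSV layout are assumptions; note that a plain tag write is historized with the server time, not the RTU's timestamp.

# Rough sketch (Ignition gateway scope): push values from a received CSV file
# into pre-created memory tags so they are historized and visible on screens.
# Assumes each CSV row looks like "tagname,value,rtu_timestamp" -- adjust to your format.
import csv

filePath = '/path/to/input/folder/rtu01.csv'   # hypothetical file
tagFolder = '[default]RTU01/'                  # hypothetical memory-tag folder

rows = system.file.readFileAsString(filePath).splitlines()
paths = []
values = []
for line in csv.reader(rows):
    tagName, value = line[0], line[1]
    paths.append(tagFolder + tagName)
    values.append(float(value))

# write all parsed values in one call (Ignition 8.x; use system.tag.writeAll on 7.x)
system.tag.writeBlocking(paths, values)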
Very impressive. My client has a strange requirement: for data analysis, he wants both remote and local timestamps to be recorded in a separate database. Remote RTUs can send timestamped datalogs and alarm logs as CSV by FTP. He needs a special alarm viewer component for SCADA which can display both timestamps. Is it possible for you to build one for Ignition?
I think your NJ-SCADA can become a very good partner for mainstream SCADA, since it is highly customizable for specific requirements.
I'm not sure if Ignition's alarm table can be customized to include a user-defined field; I need to explore that. As far as storage is concerned, perhaps alarms can be defined as UDTs in Ignition so the RTU timestamp can be stored as an additional field.
As far as NJ-SCADA is concerned, I can easily store an additional column for the RTU timestamp alongside the server timestamp recorded when the alarm is received. The alarms can be displayed on an HTML page or a SCADA graphic page.
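As a simple illustration of that dual-timestamp idea, the separate alarm table could just carry both columns, with the server timestamp filled in at insert time. A rough sketch with pymysql, using hypothetical table and column names (rtu_alarm_log, rtu_ts, server_ts):

# Sketch: store an RTU-originated alarm with both timestamps in a separate table.
# Table and column names are hypothetical.
import datetime
import pymysql

connection = pymysql.connect(host='host.com', user='', password='', db='')

source = 'RTU-07/PumpFail'               # hypothetical alarm source
rtu_ts = '2017-11-20 10:15:32'           # timestamp parsed from the RTU's CSV
server_ts = datetime.datetime.now()      # timestamp when the server received the file

with connection.cursor() as cursor:
    cursor.execute("INSERT INTO rtu_alarm_log (source, rtu_ts, server_ts) VALUES (%s, %s, %s)",
                   (source, rtu_ts, server_ts))

connection.commit()
connection.close()

An alarm viewer component would then simply query and display both columns.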
Yes, it's a Node.js application, so it can run on any platform that supports Node.js.

[quote="R.Alamsha, post:14, topic:16845"]
Can you customize it to retain last tag values, as discussed?
[/quote]

Yes, I think I can.

It can act like a driver to the Ignition gateway server, pumping values received from your FTP server into Ignition, as well as receiving commands (button presses, etc.) from Ignition and sending them to any device or RTU (through your FTP module?).

[quote="R.Alamsha, post:14, topic:16845"]
Any tag limits?
[/quote]

No.

[quote="R.Alamsha, post:14, topic:16845"]
What feature do you think is the main differentiator from others?
[/quote]

It is based on the latest technologies: Node.js, JavaScript, SVG, etc.

The best thing is to pilot it on a sample test case so that we get an idea of what is possible and what is not.
This is the key. The Modbus TCP driver should work seamlessly as a master and slave simultaneously to accommodate the parties on either side. Your product has the potential to become a killer solution for IoT; just fine-tune it. I suggest you talk to some big infrastructure automation contractors in India asap (you know them); they need it. Even I may become your client one day.
What is the benefit of using FTP against, say, SQL queries to sync tables?
For example, I use MQTT for live data from a machine. As we don't have a guaranteed connection (we're on a cell service), all data for audit and data logs is stored to a local SQL instance, then synced once a connection is available.
Would using FTP be a benefit in this scenario?
Are you converting SQL tables to CSV and wrapping them up with images etc. to FTP them, then unwrapping them in Ignition?
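For reference, the store-and-forward pattern described above (buffer in a local SQL instance, sync when the link returns) is often just a flagged-row sweep. A minimal sketch, assuming a local buffer table named datalog with a synced flag and hypothetical connection details:

# Sketch of a flagged-row store-and-forward sync from a local buffer table
# to the central database. Table name 'datalog' and the 'synced' flag are hypothetical.
import pymysql

local = pymysql.connect(host='localhost', user='', password='', db='edge')
central = pymysql.connect(host='host.com', user='', password='', db='scada')

with local.cursor() as lcur, central.cursor() as ccur:
    # pull rows not yet delivered
    lcur.execute("SELECT id, t_stamp, tag, value FROM datalog WHERE synced = 0")
    rows = lcur.fetchall()

    for row in rows:
        # push to the central table, then flag the local row as delivered
        ccur.execute("INSERT INTO datalog (t_stamp, tag, value) VALUES (%s, %s, %s)",
                     (row[1], row[2], row[3]))
        lcur.execute("UPDATE datalog SET synced = 1 WHERE id = %s", (row[0],))

# commit the central inserts first, then the local flags
central.commit()
local.commit()
local.close()
central.close()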
I think Mr. Alamsha is suggesting the use of FTP for data transfer to Ignition instead of any other mechanism like MQTT or PubNub. In my opinion, PubNub is a better approach.
Where is the local SQL instance located, at the remote RTU?
How will the RTU CSV datalogs be transferred to the local SQL instance?
An RTU can be configured to retry, say every 3 minutes, in the event of an FTP delivery failure. This feature is available by default in any RTU that is designed to deliver data by FTP.
In the CCC, the FTP files are automatically imported into the SCADA SQL historian database.