Hi Forum,
We are currently trying to get data from multiple edge gateways into a redundant central Ignition Gateway, and from there we want to provide the data to our site historian/DB (Postgres in the backend).
Our site historian/DB can import tag data via OPC-UA (by subscribing to or polling an OPC-UA server) or by reading CSV files. So far we have tried two approaches:
- Using the gateway's internal OPC-UA server. Issue: data is not backfilled if the connection between the edge gateways and the central gateway is lost.
- Exporting the historian data to CSV (script at the end of this post). Issue: the idea was to query all tags every minute or so, but the script takes very long and the CSV is not user friendly; the same timestamp is repeated on every row. Is this normal behavior?
We tested with 500 tags that change every second in a simulation.
CSV file structure:
t_stamp, Tag1, Tag2, Tag3, ..., Tag n-1, Tag n
"2025-02-28 12:37:53.000","670"
"2025-02-28 12:37:53.000","670","670"
"2025-02-28 12:37:53.000","670","670","670"
"2025-02-28 12:37:53.000","670","670","670","670"
"2025-02-28 12:37:53.000","670","670","670","670","670"
"2025-02-28 12:37:53.000","670","670","670","670","670","670"
"2025-02-28 12:37:53.000","670","670","670","670","670","670","670"
The timestamp (epoch) is exactly the same for all tags, also in the historian database.
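For reference, a minimal sketch of how the stored timestamps can be checked directly in the historian tables (assuming the default sqlth_* schema; the partition table sqlth_1_data, the datasource name "Historian" and the tag path "simulator/tag1" are placeholders for our actual names):

# Check the stored timestamps for one tag directly in the historian tables
# (sqlth_1_data, "Historian" and the tag path are placeholders, adjust to your setup)
query = """
    SELECT d.t_stamp, t.tagpath, d.floatvalue, d.intvalue
    FROM sqlth_1_data d
    JOIN sqlth_te t ON t.id = d.tagid
    WHERE t.tagpath = ?
    ORDER BY d.t_stamp DESC
    LIMIT 10
"""
rows = system.db.runPrepQuery(query, ["simulator/tag1"], "Historian")
for row in rows:
	# t_stamp is stored as epoch milliseconds
	print system.date.fromMillis(row["t_stamp"]), row["tagpath"], row["floatvalue"]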
# Query tag history for the last 60 seconds and dump it to CSV
# paths = list of fully qualified historical tag paths (500 tags in our test)
endTime = system.date.now()
startTime = system.date.addSeconds(endTime, -60)
historicalData = system.tag.queryTagHistory(paths=paths,
                                            startDate=startTime,
                                            endDate=endTime)
csv = system.dataset.toCSV(historicalData)
system.file.writeFile(r"C:/myExportsTest.csv", csv)
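For completeness, a minimal sketch of how the full export could run from a gateway timer script every minute (the tag folder "[default]Simulation" and the export directory "C:/exports" are placeholders, not our actual setup):

# Minimal sketch of the scheduled export (e.g. a gateway timer script running every 60 s)
# "[default]Simulation" and "C:/exports" are placeholders

# Build the list of historical tag paths from one folder
results = system.tag.browse("[default]Simulation", {"tagType": "AtomicTag"}).getResults()
paths = [str(result["fullPath"]) for result in results]

# Query the last minute of history
endTime = system.date.now()
startTime = system.date.addSeconds(endTime, -60)
historicalData = system.tag.queryTagHistory(paths=paths,
                                            startDate=startTime,
                                            endDate=endTime)

# Write a timestamped CSV so consecutive exports do not overwrite each other
fileName = "C:/exports/history_%s.csv" % system.date.format(endTime, "yyyyMMdd_HHmmss")
system.file.writeFile(fileName, system.dataset.toCSV(historicalData))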