Seeking Recommendations for Efficient Data Extraction in Ignition Environment

Hello! I'm currently working with Ignition, using the basic SQL historian. Unfortunately, this setup doesn't offer efficient tools for extracting data from outside the Ignition environment, and the existing tools within Ignition are somewhat limited when it comes to pulling large amounts of data.

For instance, we must generate a weekly report that exports data from 500 tags at 1-second intervals. To make this work, we've split the task into multiple smaller reports using the system.tag.queryTagHistory function to prevent timeouts. This approach is not optimal and can be time-consuming.
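
For reference, the current workaround looks roughly like the sketch below; the tag paths, chunk size, and export location are placeholders, not our actual setup:

```python
# Simplified sketch of what we do today: break the week into one-hour chunks
# so each system.tag.queryTagHistory call stays small enough to avoid timeouts.
# Tag paths, chunk size, and export path are placeholders.
tagPaths = ["[default]Line1/Temperature"]  # ... up to 500 paths

end = system.date.now()
start = system.date.addDays(end, -7)

chunkStart = start
while system.date.isBefore(chunkStart, end):
    chunkEnd = system.date.addHours(chunkStart, 1)
    data = system.tag.queryTagHistory(
        paths=tagPaths,
        startDate=chunkStart,
        endDate=chunkEnd,
        returnFormat="Wide",
        noInterpolation=True,
    )
    csv = system.dataset.toCSV(data)
    fileName = "C:/exports/history_%s.csv" % system.date.format(chunkStart, "yyyyMMdd_HHmm")
    system.file.writeFile(fileName, csv)
    chunkStart = chunkEnd
```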

I believe we have two possible options to address this issue:

  1. Develop an external tool to query our database for the required data.
  2. Upgrade our historian to a more advanced and feature-rich alternative.

I'm seeking recommendations for either of these options. I've heard PI and Canary thrown around a bit. Could anyone suggest any proven, stable, and easily integrated solutions?

Thanks in advance!

Purchase the WebDev module, and then you can create web endpoints that return the historical data as JSON, CSV, etc.
Whatever external system you are using can then just issue calls against those endpoints and get the data back.
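
As a rough illustration (not a drop-in implementation), the GET handler of a WebDev Python resource could look something like this; the query-parameter names and date format are just assumptions:

```python
# Hypothetical WebDev doGet handler: the "paths", "start", and "end" query
# parameters and their date format are assumptions about the caller.
def doGet(request, session):
    params = request['params']
    paths = params['paths'].split(',')
    start = system.date.parse(params['start'], "yyyy-MM-dd HH:mm:ss")
    end = system.date.parse(params['end'], "yyyy-MM-dd HH:mm:ss")

    data = system.tag.queryTagHistory(
        paths=paths, startDate=start, endDate=end, returnFormat="Wide"
    )

    # Convert the dataset into a list of row dicts so it can be JSON-encoded.
    headers = list(data.getColumnNames())
    rows = [
        dict(zip(headers, [data.getValueAt(i, h) for h in headers]))
        for i in range(data.getRowCount())
    ]
    return {'json': system.util.jsonEncode(rows)}
```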

You can also display the data through the WebDev module.

Writing a module is also an option (costing only your time :slight_smile:), e.g. ignition-extensions/HistoryServlet.kt at main · IgnitionModuleDevelopmentCommunity/ignition-extensions · GitHub

What's the ultimate destination for this data? It might be easier to directly store the data into that system at storage time, rather than attempt to query -> export -> import into the other system asynchronously downstream.

I was under the impression that with WebDev we would still be using the same underlying Python methods available in Ignition to extract historical tag data. Is that correct?

Different destinations. Without getting into too much detail, some are for testing, and some are for warranty requirements and monitoring. I will look at the module extensions. What about third-party historians? If they have built-in tools for data extraction, that may be the way to go. It's possible we have more money than time for this.

Off the shelf, right now, there are InfluxDB and Canary modules available. I can't personally attest to their quality or ease of data egress; I haven't used them myself. If you need a solution tomorrow, that's probably where I would go first. You might be surprised how far WebDev gets you, though. Keep in mind that with WebDev, you are in charge of how the data gets written out over the wire; if you're currently doing system.tag.queryTagHistory from the Designer or in a Vision client, you're paying a huge penalty for the gateway serializing and the client deserializing the data from the fairly expensive XML format it's currently written in over the wire.

With a WebDev endpoint, it's relatively easy to "stream" JSON results that most third parties can understand, and because it's "streaming" you pay a lot less in terms of memory.
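
A rough sketch of that streaming pattern, assuming direct access to the underlying servlet response via request['servletResponse'] (check the WebDev docs for the exact response contract; the one-hour query window and returning None after writing the response yourself are assumptions here):

```python
# Hypothetical streaming doGet handler: query history in one-hour windows and
# write each window's rows to the servlet response as they arrive, so only one
# chunk of data is held in memory at a time. Parameter names are assumptions.
def doGet(request, session):
    resp = request['servletResponse']
    resp.setContentType("application/json")
    writer = resp.getWriter()

    paths = request['params']['paths'].split(',')
    end = system.date.now()
    start = system.date.addDays(end, -7)

    writer.write('[')
    first = True
    chunkStart = start
    while system.date.isBefore(chunkStart, end):
        chunkEnd = system.date.addHours(chunkStart, 1)
        data = system.tag.queryTagHistory(
            paths=paths, startDate=chunkStart, endDate=chunkEnd, returnFormat="Wide"
        )
        headers = list(data.getColumnNames())
        for i in range(data.getRowCount()):
            row = dict(zip(headers, [data.getValueAt(i, h) for h in headers]))
            if not first:
                writer.write(',')
            writer.write(system.util.jsonEncode(row))
            first = False
        chunkStart = chunkEnd
    writer.write(']')

    # Assumption: returning None signals that the response was already written
    # directly; verify against the WebDev module documentation.
    return None
```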

Thanks, @PGriffith. Could you direct me to any relevant documentation that describes the performance improvements of making requests through WebDev? I've searched online and in Ignition's documentation but couldn't find much performance information.

Consider using wide tables instead of the tag historian (via the SQL Bridge module or scripted inserts). If you are always recording at one-second intervals, the tag historian is terribly inefficient. A side benefit is that wide tables are very friendly to external queries.
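
For the scripted-insert flavor of that, a gateway timer script at a 1-second fixed rate can append one row per scan to a wide table. This is only a sketch; the table name, column names, tag paths, and database connection below are all made up:

```python
# Hypothetical gateway timer script (1-second fixed rate). The "line1_history"
# table, its columns, the tag paths, and the "history" connection are examples.
paths = [
    "[default]Line1/Temperature",
    "[default]Line1/Pressure",
    "[default]Line1/Flow",
]
values = [qv.value for qv in system.tag.readBlocking(paths)]

system.db.runPrepUpdate(
    "INSERT INTO line1_history (t_stamp, temperature, pressure, flow) VALUES (?, ?, ?, ?)",
    [system.date.now()] + values,
    database="history",
)
```

External tools can then pull data with a plain SELECT over a time range, one column per tag, with no historian decoding required.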
