Can PowerChart graphs be populated from a sproc?

I believe the answer is no, but maybe someone has already built (or has an idea for) an ad-hoc trending tool with the features that make PowerChart great (the tag browser and aggregation mode for longer timeframes), but with the flexibility to populate the chart from a sproc, the way you can with a Time Series chart.

Also, does PowerChart's aggregation mode depend on the 'Enable Pre-processed Partitions' checkbox being checked to return min, max, avg, etc. from the database, or will Ignition generate SQL to calculate these on the fly if a pre-processed partition doesn't exist?

Honestly, I’ve looked at the SQL that PowerChart generates in our SQL profiler, and it’s pretty rough - prototype rough, not a polished final product at all. When drawing a live chart, it runs about three database transactions just to figure out which partition it needs to query (even if there’s only one), and then it refreshes the whole frame rather than refreshing just the last few seconds (or whatever your live interval is) and pushing the oldest part of the trend off the back. I’d file a database hurt-feelings report if I knew where to send it.

Ignition version is 8.3.2, and I’m using a fairly recent JDBC driver for MSSQL (13.2.1) as well.

If you don’t know about Embr Charts, then get to know Embr Charts.

Perspective native charting won’t have major improvements until 2027.


You are correct; the Power Chart is limited to working with the historian, not raw datasets.

No, the historian will run whatever aggregation queries it needs to.
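To make that concrete: what "aggregation on the fly" amounts to is bucketing raw samples into time windows and computing min/max/avg per window. The sketch below is my own plain-Python illustration of that idea, not Ignition's actual implementation or the SQL it emits.

```python
# Hypothetical sketch of time-bucketed aggregation: group raw
# (timestamp_ms, value) samples into fixed windows, then compute
# min/max/avg per window. This illustrates the concept only; it is
# NOT what the Ignition historian actually runs.

def aggregate(samples, start_ms, end_ms, buckets):
    """Return one (min, max, avg) tuple per time bucket, or None for empty buckets."""
    width = (end_ms - start_ms) / float(buckets)
    grouped = [[] for _ in range(buckets)]
    for ts, value in samples:
        if start_ms <= ts < end_ms:
            # Clamp to the last bucket to guard against float edge cases.
            idx = min(int((ts - start_ms) / width), buckets - 1)
            grouped[idx].append(value)
    results = []
    for values in grouped:
        if values:
            results.append((min(values), max(values), sum(values) / float(len(values))))
        else:
            results.append(None)  # no samples fell in this bucket
    return results
```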

This is splitting hairs, but the distinction is worth teasing out.
The Power Chart doesn't generate any SQL. The only things the Power Chart "knows" about are the history providers configured on the gateway; it operates a layer of abstraction above any raw SQL. This is good for flexibility (there are first- and third-party modules that don't use SQL databases at all for history storage) but bad for some kinds of optimization.

This is necessary due to the architecture of the historian. One database can be used by any number of gateways to store history, and the SQL historian must run those extra queries to determine where the history for this particular tag, in this particular tag provider, on this particular gateway, is stored. Imagine the disaster if you silently started getting another tag's history just because of a name collision.

This one I don't have an excuse for. Vision does exactly this optimization in its history querying, but it may never have made it to Perspective, or it may not be working.


This, I suspect. The Power Chart makes me want to pull my hair out.


We’ve set up a POC with Embr Charts and… it seems to work pretty well, but customization through the JS-Jython bridge feels unpolished to a new user. To help smooth that transition, could someone point to a resource or two with working examples - preferably one with a live chart that pulls only the fresh data it needs from a SQL source and purges the oldest data falling outside the display window?

Or is there another tool better suited for the job?

@bmusson would probably be willing to help with an example in this topic:

There could be one buried within the main thread for the module, as well.

I know it's not the same as a live-chart, but here's an example of an infinite scrolling chart with automatic data density upgrades I've been working on:

Basic gist:

  • A custom Chart.js plugin is installed that includes a basic timeseries cache.
  • As the chart is panned/zoomed, the cache is queried.
  • If the cache doesn't have the values at the required density (1 data point per pixel), a message is sent to gateway to load the data.
  • The gateway queries for the data, then sends it back to the chart where it is stored in the cache.
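The gap detection in the third bullet is the interesting part: the cache has to work out which sub-ranges of the visible window are either missing entirely or only cached at too coarse a density. Here's my own simplified Python sketch of that logic (the names are mine; the real implementation is the `computeGaps` function in the plugin below):

```python
# Simplified sketch of the cache's gap detection: given cached chunks,
# return the sub-ranges of [start, end] that still need to be fetched,
# either because nothing is cached there or because the cached data is
# coarser than the requested density (points per ms).

def compute_gaps(chunks, start, end, density):
    """chunks: list of dicts with 'start', 'end', and 'density' keys."""
    gaps = []
    cursor = start
    relevant = sorted(
        (c for c in chunks if c['end'] > start and c['start'] < end),
        key=lambda c: c['start'],
    )
    for chunk in relevant:
        if chunk['end'] <= cursor:
            continue  # fully behind the cursor already
        if chunk['start'] > cursor:
            gaps.append((cursor, chunk['start']))  # uncovered range before this chunk
        if chunk['density'] < density:
            # Cached, but too coarse for the current zoom level: re-fetch it.
            gaps.append((max(cursor, chunk['start']), min(chunk['end'], end)))
        cursor = max(cursor, chunk['end'])
    if cursor < end:
        gaps.append((cursor, end))  # uncovered tail
    return gaps
```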

This demonstrates the basic approaches of:

  1. Hooking into the chart's lifecycle.
  2. Triggering the gateway to do something from the chart.
  3. Triggering the chart to do something from the gateway.

In order to do live updates, you could:

  1. Set up a polling event on the chart (and store a reference to it for later teardown).
  2. Use that polling event to request data from the gateway.
  3. When the chart receives the new data, update the chart's range and prune old data.
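Step 3 is the rolling-window part the OP asked about. This is my own minimal sketch of the idea (not part of the module): append the freshly polled points, then drop everything that has scrolled out of the display window.

```python
# Sketch of a live-chart rolling window: merge freshly polled points
# into the series, then drop anything older than the visible window.
# The chart's x-range would then be set to (cutoff, now_ms) before a
# redraw, e.g. chart.update('none') on the JS side.

def apply_live_update(series, new_points, window_ms, now_ms):
    """series/new_points: lists of {'x': timestamp_ms, 'y': value} dicts."""
    series.extend(new_points)
    cutoff = now_ms - window_ms
    # Keep only the points still inside the visible window.
    return [pt for pt in series if pt['x'] >= cutoff]
```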
view.json
{
  "custom": {},
  "params": {},
  "props": {},
  "root": {
    "children": [
      {
        "meta": {
          "name": "Chartjs"
        },
        "position": {
          "height": 285,
          "width": "100%"
        },
        "props": {
          "data": {
            "datasets": [
              {
                "label": "Dataset 1"
              }
            ]
          },
          "events": {
            "chart": {
              "lifecycle": {
                "onMount": "(chart) \u003d\u003e chart.$infiniteScroll.requestHistory()"
              }
            }
          },
          "options": {
            "animation": false,
            "elements": {
              "point": {
                "radius": 0
              }
            },
            "plugins": {
              "zoom": {
                "limits": {
                  "x": {
                    "minRange": 3600000
                  },
                  "y": {
                    "max": "original",
                    "min": "original",
                    "minRange": null
                  }
                },
                "pan": {
                  "modifierKey": null,
                  "onPanComplete": "(context) \u003d\u003e context.chart.$infiniteScroll.onScaleChange()"
                },
                "zoom": {
                  "drag": {
                    "enabled": false
                  },
                  "onZoomComplete": "(context) \u003d\u003e context.chart.$infiniteScroll.onScaleChange()",
                  "wheel": {
                    "modifierKey": null
                  }
                }
              }
            },
            "scales": {
              "x": {
                "type": "timestack"
              },
              "y": {
                "type": "linear"
              }
            }
          },
          "plugins": [
            {
              "beforeInit": "(chart, args, options) \u003d\u003e {\n    if (!options) return;\n    if (chart.$infiniteScroll) return;\n\n    const state \u003d chart.$infiniteScroll \u003d {\n        destroyed: false,\n        cache: null,\n        inflight: new Set(),\n    };\n\n    const config \u003d {\n        datasets: [],\n        margin: 3600*1000,\n        dataSource: {\n            history: (context) \u003d\u003e perspective.sendMessage(\u0027data-request\u0027, {\n                type: \u0027history\u0027,\n                start: context.start,\n                end: context.end,\n                density: context.density,\n            })\n        },\n        ...options,\n    };\n\n    if (!config.dataSource?.history) {\n        console.error(\"infinite scroll plugin requires dataSource.history\");\n        return;\n    }\n\n    /* ---------------------------------\n     * Timeseries Cache with density upgrade\n     * --------------------------------- */\n    function createTimeseriesCache() {\n        let chunks \u003d [];\n\n        function insertChunk(newChunk) {\n            if (!Array.isArray(newChunk.data) || !Array.isArray(newChunk.data[0])) {\n                throw new Error(\"Chunk.data must be array of arrays, one per dataset\");\n            }\n\n            const updatedChunks \u003d [];\n\n            for (const chunk of chunks) {\n                // Non-overlapping: keep\n                if (chunk.end \u003c\u003d newChunk.start || chunk.start \u003e\u003d newChunk.end) {\n                    updatedChunks.push(chunk);\n                    continue;\n                }\n\n                // Overlap: split lower-density chunks\n                if (chunk.density \u003c newChunk.density) {\n                    if (chunk.start \u003c newChunk.start) {\n                        updatedChunks.push({\n                            ...chunk,\n                            start: chunk.start,\n                            end: newChunk.start,\n               
         });\n                    }\n                    if (chunk.end \u003e newChunk.end) {\n                        updatedChunks.push({\n                            ...chunk,\n                            start: newChunk.end,\n                            end: chunk.end,\n                        });\n                    }\n                    // Overlapped portion is removed\n                } else {\n                    updatedChunks.push(chunk); // Keep equal/higher-density\n                }\n            }\n\n            updatedChunks.push(newChunk);\n            chunks \u003d updatedChunks.sort((a, b) \u003d\u003e a.start - b.start);\n        }\n\n        function collect(start, end, requestedDensity, datasetCount) {\n            const result \u003d Array.from({ length: datasetCount }, () \u003d\u003e []);\n            const relevantChunks \u003d chunks.filter(c \u003d\u003e c.end \u003e start \u0026\u0026 c.start \u003c end);\n\n            for (let i \u003d 0; i \u003c datasetCount; i++) {\n                const pointsByX \u003d new Map();\n\n                for (const c of relevantChunks) {\n                    const data \u003d c.data[i];\n                    if (!data) continue;\n\n                    for (const pt of data) {\n                        if (pt.x \u003e\u003d start \u0026\u0026 pt.x \u003c\u003d end) {\n                            const existing \u003d pointsByX.get(pt.x);\n                            if (!existing || existing.density \u003c c.density) {\n                                pointsByX.set(pt.x, { ...pt, density: c.density });\n                            }\n                        }\n                    }\n                }\n\n                result[i] \u003d Array.from(pointsByX.values())\n                    .sort((a, b) \u003d\u003e a.x - b.x)\n                    .map(pt \u003d\u003e ({ x: pt.x, y: pt.y }));\n            }\n\n            return result;\n        }\n\n        function computeGaps(start, end, requestedDensity) 
{\n            const gaps \u003d [];\n            let cursor \u003d start;\n\n            // Sort chunks by start\n            const relevantChunks \u003d chunks\n                .filter(c \u003d\u003e c.end \u003e start \u0026\u0026 c.start \u003c end)\n                .sort((a, b) \u003d\u003e a.start - b.start);\n\n            for (const chunk of relevantChunks) {\n                // Skip chunks fully below cursor\n                if (chunk.end \u003c\u003d cursor) continue;\n\n                // Gap before this chunk\n                if (chunk.start \u003e cursor) {\n                    gaps.push({ start: cursor, end: chunk.start });\n                }\n\n                // If chunk density \u003c requested, need to request this region\n                if (chunk.density \u003c requestedDensity) {\n                    const gapStart \u003d Math.max(cursor, chunk.start);\n                    const gapEnd \u003d Math.min(chunk.end, end);\n                    gaps.push({ start: gapStart, end: gapEnd });\n                }\n\n                cursor \u003d Math.max(cursor, chunk.end);\n            }\n\n            // Gap after last chunk\n            if (cursor \u003c end) gaps.push({ start: cursor, end });\n\n            // Merge adjacent gaps\n            const merged \u003d [];\n            for (const g of gaps) {\n                if (!merged.length || g.start \u003e merged[merged.length - 1].end) {\n                    merged.push({ ...g });\n                } else {\n                    merged[merged.length - 1].end \u003d Math.max(merged[merged.length - 1].end, g.end);\n                }\n            }\n\n            return merged;\n        }\n\n        function pruneOutside(min, max) {\n            chunks \u003d chunks.filter(c \u003d\u003e c.end \u003e min \u0026\u0026 c.start \u003c max);\n        }\n\n        return { insertChunk, collect, computeGaps, pruneOutside };\n    }\n\n    /* ---------------------------------\n     * Chart Setup\n     * 
--------------------------------- */\n    chart.data.datasets \u003d config.datasets;\n    chart.options.scales ??\u003d {};\n    chart.options.scales.x ??\u003d { type: \"timestack\" };\n    state.cache \u003d createTimeseriesCache();\n\n    /* ---------------------------------\n     * applyData (async callback)\n     * --------------------------------- */\n    chart.$infiniteScroll.applyData \u003d (rows, density, range) \u003d\u003e {\n        if (!rows?.length) return;\n        const datasetCount \u003d rows[0].series.length;\n        const perDataset \u003d Array.from({ length: datasetCount }, () \u003d\u003e []);\n\n        for (const row of rows) {\n            row.series.forEach((pt, i) \u003d\u003e perDataset[i].push({ x: pt.x, y: pt.y }));\n        }\n\n        state.cache.insertChunk({\n            start: range.min,\n            end: range.max,\n            density,\n            data: perDataset,\n        });\n\n        state.inflight.delete(`${range.min}:${range.max}:${density}`);\n        renderFromCache();\n    };\n\n    /* ---------------------------------\n     * Render from cache\n     * --------------------------------- */\n    function renderFromCache() {\n        const xScale \u003d chart.scales?.x;\n        if (!xScale) return;\n\n        const min \u003d xScale.min;\n        const max \u003d xScale.max;\n        const density \u003d chart.width / (max - min);\n        const datasetCount \u003d chart.data.datasets.length;\n\n        const collected \u003d state.cache.collect(min, max, density, datasetCount);\n        for (let i \u003d 0; i \u003c datasetCount; i++) {\n            chart.data.datasets[i].data \u003d collected[i] || [];\n        }\n\n        const margin \u003d (max - min) * 2;\n        state.cache.pruneOutside(min - margin, max + margin);\n\n        chart.update(\"none\");\n    }\n\n    /* ---------------------------------\n     * Request history (density upgrade aware)\n     * --------------------------------- */\n    function 
requestHistory() {\n        const xScale \u003d chart.scales?.x;\n        if (!xScale) return;\n\n        const min \u003d xScale.min;\n        const max \u003d xScale.max;\n        const density \u003d chart.width / (max - min);\n\n        const gaps \u003d state.cache.computeGaps(min, max, density);\n        for (const gap of gaps) {\n            const key \u003d `${gap.start}:${gap.end}:${density}`;\n            if (state.inflight.has(key)) continue;\n\n            state.inflight.add(key);\n            config.dataSource.history({\n                chart,\n                config,\n                density,\n                start: gap.start,\n                end: gap.end,\n            });\n        }\n    }\n    chart.$infiniteScroll.requestHistory \u003d requestHistory;\n\n    /* ---------------------------------\n     * Scale change\n     * --------------------------------- */\n    chart.$infiniteScroll.onScaleChange \u003d () \u003d\u003e {\n        if (!state.destroyed) {\n            requestHistory();\n            renderFromCache();\n        }\n    };\n\n    /* ---------------------------------\n     * Destroy\n     * --------------------------------- */\n    const originalDestroy \u003d chart.destroy;\n    chart.destroy \u003d function () {\n        state.destroyed \u003d true;\n        config.dataSource.destroy?.(chart);\n        originalDestroy.apply(this, arguments);\n    };\n};",
              "events": [
                "mousemove",
                "mouseout",
                "click",
                "touchstart",
                "touchmove"
              ],
              "id": "infiniteScroll"
            }
          ]
        },
        "scripts": {
          "customMethods": [
            {
              "name": "get_history_data",
              "params": [
                "start",
                "end",
                "density"
              ],
              "script": "    \"\"\"\n    Generate synthetic test points between start and end,\n    using the requested density (points per millisecond).\n    Returns a list of dicts: { \u0027series\u0027: [ {x, y}, ... ] }\n    \"\"\"\n    as_date \u003d lambda x: system.date.fromMillis(int(x))\n    \n    now \u003d system.date.now()\n    start_ms \u003d system.date.toMillis(self.min_date(as_date(start), now))\n    end_ms \u003d system.date.toMillis(self.min_date(as_date(end), now))\n\n    # Compute number of points from density\n    points \u003d max(2, int((end_ms - start_ms) * density))\n    dt \u003d (end_ms - start_ms) / float(points - 1)\n\n    results \u003d []\n\n    for i in range(points):\n        x \u003d start_ms + i * dt\n        data_point \u003d self.generate_data_point(x)\n        results.append({\u0027series\u0027: [data_point]})\n\n    return results"
            },
            {
              "name": "generate_data_point",
              "params": [
                "x"
              ],
              "script": "\timport math\n\t\n\tamplitude \u003d 1\n\tfrequency \u003d 1\n\t\n\tt \u003d x / 10000000.0\n\ty \u003d amplitude * math.sin(2 * math.pi * frequency * t)\n\t\n\treturn {\n\t\t\u0027x\u0027: x,\n\t\t\u0027y\u0027: y\n\t}"
            },
            {
              "name": "min_date",
              "params": [
                "a",
                "b"
              ],
              "script": "\tif (system.date.isBefore(a, b)):\n\t\treturn a\n\telse:\n\t\treturn b"
            }
          ],
          "extensionFunctions": null,
          "messageHandlers": [
            {
              "messageType": "data-request",
              "pageScope": true,
              "script": "\tdata \u003d []\n\trequest_type \u003d payload.get(\u0027type\u0027)\n\t\t\n\tif (request_type \u003d\u003d \u0027history\u0027):\t\n\t\tstart \u003d payload.get(\u0027start\u0027)\n\t\tend \u003d payload.get(\u0027end\u0027)\n\t\tdensity \u003d payload.get(\u0027density\u0027)\n\t\trows \u003d self.get_history_data(start, end, density)\n\t\t\t\t\n\t\tif len(rows) \u003d\u003d 0:\n\t\t\treturn\n\t\t\t\n\t\tproxy \u003d self.getJavaScriptProxy()\n\t\tproxy.runAsync(\u0027\u0027\u0027(rows, density, range) \u003d\u003e {\n\t\t\tconsole.log(\u0027history returned\u0027, rows)\n\t\t\tthis.$infiniteScroll?.applyData(rows, density, range)\n\t\t}\u0027\u0027\u0027, {\n\t\t\t\u0027rows\u0027: rows,\n\t\t\t\u0027density\u0027: density,\n\t\t\t\u0027range\u0027: {\n\t\t\t\t\u0027min\u0027: start,\n\t\t\t\t\u0027max\u0027: end\n\t\t\t}\n\t\t})",
              "sessionScope": false,
              "viewScope": false
            }
          ]
        },
        "type": "embr.chart.chart-js"
      }
    ],
    "meta": {
      "name": "root"
    },
    "type": "ia.container.coord"
  }
}

I know, it's a lot of manual setup. My only concern so far has been making these things possible.

The future development effort is going into making it easier to write and reuse this JavaScript code, meaning you (or community members) can write this functionality once and then apply it across multiple charts.
