This is a stress test example of roughly 200k data points of simulated data using ChartJS in Ignition.
That is a great example!
A few questions:
- I notice you seem to have different numbers of datapoints per tag. Are you pulling each tag with a separate tag history query?
- If so, how does that affect your x-trace popup box?
- How much of that time would you say it takes to do the query versus the rendering? There was a delay when you clicked the button, but obviously some of that time is pulling the 200k points.
Thanks!
I am not using tag historian here. This is a custom historian, and the items on the left side menu are just KPIs for the specific assets.
The delay before the render is the query running and returning to the view (it is run on the gateway and sent back to the client); the chart render itself happens roughly when you see the first line come to the screen.
If you’re going to be plotting large (100k+) datasets, I’d recommend loading them in chunks to keep them out of the property tree.
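A minimal sketch of that chunking idea in plain Python (the chunk size and row count are illustrative; the actual hand-off to the chart, e.g. via a JS proxy, is not shown):

```python
def chunked(rows, size=20000):
    """Yield successive slices of at most `size` rows."""
    for start in range(0, len(rows), size):
        yield rows[start:start + size]

# 225k simulated rows split into 20k chunks: 11 full chunks plus a 5k remainder
rows = list(range(225000))
chunks = list(chunked(rows))
```

Feeding each chunk to the chart on a timer (or on the component's update tick) keeps the full dataset out of the property tree.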
I can answer this one. You have control over which points are displayed in the tooltip by setting the `mode` and the `axis`.
This sample page does a good job of demonstrating the differences.
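For reference, the relevant Chart.js settings live under `options.interaction` (`mode`, `axis`, `intersect`). Sketched here as a Python dict mirroring the JSON you'd put in the chart's props; the values are just one common x-trace-style combination, not a recommendation:

```python
# Tooltip follows the nearest point measured along the x axis,
# without requiring the cursor to touch the point itself.
interaction = {
    "mode": "nearest",   # candidate selection: nearest element
    "axis": "x",         # measure distance along x only
    "intersect": False,  # fire even when not directly over a point
}
```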
It's already pretty quick, but to go faster is amazing.
Got that need for speed.
I will play with doing it in chunks and see what the difference is. Makes sense to avoid the props tree overall.
On this, I think the `OnChartUpdate` method, along with maybe a value of `-1` disabling the update rate, might be a nice way to have all the data loading in one place. Otherwise you have to be grabbing the JSProxy from somewhere else and control loading there. Worst case, if you aren't loading any data at that update tick, you just include a `pass`.
My biggest problem with `OnChartUpdate` is that the update rate lives in the props property tree. It’s convenient, I guess, but it doesn’t feel very "Perspective". It’d be nice to have an additional property tree that was not synced with the client. I know you can change a property’s access to `private`, but it’s not very obvious, and it doesn’t make sense for anything inside of `props`, IMO.
I can understand that; it feels more like the advanced functions of the power table in Vision.
My main issue currently is tying constant polling to the user panning the bounds: when the bounds move, the polled range has to change and the overall dataset has to be trimmed.
Can you elaborate a bit?
Are you trying to build a combination realtime/historic chart?
That's exactly it. My planned attempt is going to be using `runscript(somefunc, 15000)` and have a function that queues up data querying asynchronously and stuffs it into a pageVarMap or something.
If user pans then they'll have to wait for the next poll. Panning updates the bounds throughout the motion so I can't trigger a poll until I see it settle.
I'm looking through pturmel's tag report utility to see if I can reuse his tag history queuing code.
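One way to tell that panning has settled is to timestamp every bounds change and only fire the query once the bounds have been stable for some window. A hedged sketch in plain Python (the class and the 0.5 s window are made up for illustration; wiring it to the chart's bounds events is not shown):

```python
import time

class PanSettleDetector:
    """Fire a history query only after the pan bounds have stopped
    changing for `settle_s` seconds."""

    def __init__(self, settle_s=0.5):
        self.settle_s = settle_s
        self.bounds = None
        self.last_change = None

    def on_bounds(self, bounds, now=None):
        """Record a bounds update (call this from the pan handler)."""
        now = time.monotonic() if now is None else now
        if bounds != self.bounds:
            self.bounds = bounds
            self.last_change = now

    def settled(self, now=None):
        """True once the bounds have been stable long enough to query."""
        now = time.monotonic() if now is None else now
        return (self.last_change is not None
                and now - self.last_change >= self.settle_s)
```

On each poll tick you would check `settled()` and only then issue the query for the current bounds.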
Mildly interesting interaction with chart data: if I update the component's `data.dataset[x].data` property via a script or binding, the entire chart appears to redraw, which in turn resets the zoom level.
I found something similar online where passing the entire `options` object to the chart would cause the zoom plugin to reset on render: [reactjs] zoom is resetting on render · chartjs/chartjs-plugin-zoom · Discussion #589 · GitHub
If I touch/load/modify the data using only a javascript proxy object, the chart receives the new data and updates the lines accordingly without resetting the zoom level.
Mildly related, `runscript` doesn't appear to have any way to access the view to be able to get a proxy object off the chart. I tried passing in `{view}`, but I guess that's different than calling `self.view` from a value change script.
For the time being I'm testing using a binding of `now(15000)` and a value change script that calls my loading function.
I’ve run into the same issue, but never came across the thread you linked. The Embr-Charts component is using the `react-chartjs-2` library mentioned in that thread, so it’s likely related. I’ve been wanting to drop the `react-chartjs-2` dependency for a while now, so maybe this is the push to do it. There’s nothing strictly wrong with it; it’s just unnecessary and doesn’t 100% fit with Perspective’s update model (as seen here, lol).
Exactly what type of object does it give you when passing a view as a `runScript` parameter? That’s something I’ve never tried.
Use my `objectScript()` instead of `runScript()`. Try this on a view custom property:
objectScript("repr(dir())")
Then, on any of those local names, try this:
objectScript("system.reflect.object(someLocal)")
Psst! self.view
After messing with it a bit, it just gives you `None` if you are touching the base node like `{view}` or `{/root}`. Same thing for pointing it at the component node (`{/root/Chartjs}`). For what it's worth, it 'technically' doesn't let you select those.
Once you point it to a property on those nodes, it gives you that property value as its respective type. Looks like I'll need to use pturmel's `objectScript` instead.
Edit: Was able to get basic polling working using `objectScript('library.fetchHistory(self)', now(15000))`
For anyone loading data using a javascript proxy: do NOT try to load 225k rows in one `runBlocking` call. It will tank your gateway. Learn from my mistakes.
Thankfully this was our dev gateway.
I’m not surprised that this crashed somewhere, but I am surprised that it stopped more than the Perspective session.
I do see the default timeout durations are set incorrectly; `runAsync`/`runBlocking` currently time out after 30,000 seconds instead of 30,000 milliseconds (darn Studio5000 brain). This same bug existed in an older version of Periscope’s `runJavaScript` functions and obviously survived the port into the component proxy.
Looking at it further, I was calling the 225k write every 15-ish seconds, so I think it stacked over one Perspective session and one Designer session. We are also running a fairly small server, so it could just be a meager-horsepower issue.
I was also running into the issue of my object script call failing because I was calling `this.update` with null data in my datasets, so I think it was firing faster than the timing I had set up.
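A guard like the following can avoid that: skip the update whenever any dataset is still empty. This is just a sketch against a plain dict; in the real script `chart` would be the JS proxy object and the final write would be a `chart.update()` call:

```python
def safe_update(chart):
    """Call the chart's update only when every dataset has data.

    `chart` is a plain dict here, standing in for the proxy object.
    Returns True when the update was performed, False when skipped.
    """
    datasets = chart.get("data", {}).get("datasets", [])
    if not datasets or any(ds.get("data") in (None, []) for ds in datasets):
        return False  # data not loaded yet; skip this tick
    chart["updated"] = True  # stand-in for chart.update()
    return True
```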
Got a good question earlier that I think is worth posting here.
Opening a Popup When a Point is Clicked
// options.onClick
(event, elements, chart) => {
  console.log(elements)

  const firstElement = {
    x: elements[0]?.element.$context.parsed.x,
    y: elements[0]?.element.$context.parsed.y
  }
  console.log(firstElement)

  const popup = {
    type: 'open',
    id: 'my-popup',
    viewPath: 'Charts/Chartjs/Resources/Popup',
    viewParams: { text: JSON.stringify(firstElement) },
    title: 'Title Goes Here',
    modal: true,
    overlayDismiss: true,
  }

  perspective.context.client.onScriptPopup(popup)
}
And here's a list of all the properties you can pass to `onScriptPopup`:
export interface PopupActionConfig {
  type: ViewVisibilityType;
  id: string;
  viewPath: string;
  viewParams?: any;
  title?: string;
  position?: PopupPosition | RelativePopupPosition;
  positionType: PopupLocationType;
  showCloseIcon?: boolean;
  draggable?: boolean;
  resizable?: boolean;
  modal?: boolean;
  overlayDismiss?: boolean;
}

export type ViewVisibilityType = "open" | "close" | "toggle";
export type PopupLocationType = "exact" | "relative";

export interface PopupPosition {
  width?: number;
  height?: number;
  left?: number;
  top?: number;
  right?: number;
  bottom?: number;
}

export interface RelativePopupPosition {
  width?: number;
  height?: number;
  top?: number;
  left?: number;
  relativeLocation: string;
}
Version 2.0.3 Released
Includes an update to Chart.js `4.4.8` and a correction to the proxy timeout values.
I managed to get this to load by chunking it, and it only takes about 7 seconds to load that many rows when loading in chunks of 20k. Data retrieval on my system (to provide to the chart) was only about 2 seconds on top of that, so total time is ~9 seconds.
On top of that, I can freely scroll the entire range of the data with no delay once it's loaded, so this is absolutely miles ahead of any of the built-in chart solutions.
The only downside with a dataset this size is that it absolutely pegs my gateway cpu for those 7 seconds.
Edit: Realized I had data decimation on; that is probably helping quite a bit with the display.