Perspective Table Performance Issue with Large Datasets

I wonder why we see performance issues when a Perspective Table has more than 3,000 rows (the page freezes, the gateway crashes, and so on), while SQL IDE tools can easily read and display more than 20,000 rows.
What is the main difference between the two?


Simple. SQL IDEs like DataGrip, or the native tools for a particular DB platform, don't convert everything to JSON.

But the QuestDB IDE, which is web based, does exactly that, and the DBeaver Docker version is also web based and uses JSON.
Also, I remember tables and the Easy Chart in Vision having the same problem when the dataset is more than 1,500 rows.

They are probably not converting everything to JSON, but delivering optimized HTML to populate the table directly.

Uhm, no. I've delivered many tens of thousands of rows to Vision tables without problems, and millions of rows to Vision charts. (I typically need my TS DB Cache module to move such large datasets, but Vision has no problem displaying them.)

Is the Table's virtualized property turned on? Probably not relevant for your use case, but I find that it helps. Using paging helps as well...
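
To illustrate the paging idea, here is a minimal sketch of a gateway-scoped Ignition script that feeds a Perspective Table one page at a time instead of the whole result set. The table, column, and database names are hypothetical, and the LIMIT/OFFSET syntax assumes a MySQL/PostgreSQL-style dialect:

```python
# Minimal paging sketch for a Perspective Table (runs gateway-scoped in Ignition).
# 'history_table', its columns, and 'MyDatabase' are hypothetical placeholders.

PAGE_SIZE = 200  # rows the browser has to render and hold at any one time

def getPage(page, pageSize=PAGE_SIZE, db="MyDatabase"):
    """Return a single page of rows for the table's props.data binding."""
    offset = page * pageSize
    query = """
        SELECT t_stamp, tag_name, tag_value
        FROM history_table
        ORDER BY t_stamp DESC
        LIMIT ? OFFSET ?
    """
    # runPrepQuery returns a PyDataSet, which the Perspective Table accepts directly.
    return system.db.runPrepQuery(query, [pageSize, offset], db)
```

Bound to custom page/pageSize properties and combined with the virtualized prop, the browser never has to build and hold thousands of rows of DOM at once, which is typically where the freezing comes from.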


I wonder how you loaded millions of rows in Vision. One thing I've seen: every time a client loads something big, like 3,000 rows, the RAM usage increases a lot, and when the user closes the window the RAM doesn't decrease, or decreases only a little, so eventually one day the gateway crashes.
How much RAM does your Ignition gateway have?


You have to chunk the queries. Vision can hold the chunks (and assemble them into a full dataset) while the gateway lets each chunk be garbage collected after transmission. Millions of rows in a dataset yield tens to a few hundred megabytes in client RAM, which is easily handled by the client's memory configuration.

(My Time Series Database Cache Module does this chunking, transmission, and reassembly for you.)

Vision's client RAM usage is totally independent of gateway RAM.
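
To illustrate the chunking idea (this is just a hand-rolled sketch, not what the module does internally), a Vision client script can fetch fixed-size chunks and reassemble them locally. Table, column, database, and component names are hypothetical, and LIMIT/OFFSET assumes a MySQL/PostgreSQL-style dialect:

```python
# Chunked fetch-and-reassemble sketch for a Vision client (e.g. a button script).
# The gateway only ever serializes one chunk at a time and can garbage collect
# each chunk after it has been transmitted; the client keeps the assembled whole.

CHUNK = 10000
db = "MyDatabase"  # hypothetical database connection name
query = """
    SELECT t_stamp, tag_name, tag_value
    FROM history_table
    ORDER BY t_stamp
    LIMIT ? OFFSET ?
"""

full = None
offset = 0
while True:
    chunk = system.db.runPrepQuery(query, [CHUNK, offset], db)
    chunkDs = system.dataset.toDataSet(chunk)
    full = chunkDs if full is None else system.dataset.appendDataset(full, chunkDs)
    if len(chunk) < CHUNK:
        break  # last, partial chunk reached
    offset += CHUNK

# 'full' now lives only in client RAM; hand it to a table component for display.
event.source.parent.getComponent("Power Table").data = full
```

The key point is that no single query result on the gateway side ever approaches the full dataset size, so gateway heap usage stays flat regardless of how many rows the client ends up holding.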
