Dealing with very large datasets

I remember at one point seeing something here, or somewhere online, about what to do if you have a very large dataset (hundreds of thousands of rows, a couple of columns). I'm working something out in a Vision window that will eventually feed a memory dataset tag, but for now the data needs to live in the window. Does anyone here know what I'm talking about?

I’m not sure I understand…

Is it too large to fit in memory?

The problem is generally not holding such a dataset, but getting it to the clients without communication and/or query timeouts. If it's time-series data in a database, I wrote a module to optimize that, particularly to support scrolling and zooming when large datasets are charted, though it works for any client-side purpose.
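As a rough illustration of the underlying idea (not the module's actual API), one common way to avoid a single giant round trip is to fetch a large result set in fixed-size pages. A minimal sketch, assuming a plain SQL table named `history` and using sqlite3 only for the demo:

```python
# Sketch: fetch a large result set in batches so no single transfer
# has to hold (or wait on) all rows at once. The table name "history"
# and column names here are assumptions for the demo, not a real schema.
import sqlite3

def fetch_in_chunks(conn, query, params=(), chunk_size=10000):
    """Yield rows in fixed-size batches instead of one huge list."""
    cur = conn.execute(query, params)
    while True:
        rows = cur.fetchmany(chunk_size)
        if not rows:
            break
        yield rows

# Demo with an in-memory table of 250,000 rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE history (t INTEGER, value REAL)")
conn.executemany("INSERT INTO history VALUES (?, ?)",
                 ((i, i * 0.5) for i in range(250000)))

total = 0
for batch in fetch_in_chunks(conn, "SELECT t, value FROM history"):
    total += len(batch)   # process each batch, e.g. append to a chart
print(total)  # 250000
```

The same pattern applies regardless of database: the client processes each page as it arrives instead of blocking on one monolithic query.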
The other issue is that any ordinary dataset stored as a component property when the project is saved will cause the project to balloon in size and load very slowly. Current versions of Ignition avoid this with query bindings (unless you check a box to override it), but storing the data anywhere else, or binding it to another property, will trigger the same bloat. The Time Series Database Cache module linked above includes "transient" scripting and expression functions that wrap datasets so that their rows are discarded on project save, no matter where they've been assigned.
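To illustrate the "transient" concept in isolation (this is a conceptual sketch, not the module's real API), here is a wrapper whose rows are fully usable at runtime but are dropped whenever the object is serialized, using pickle's `__getstate__` hook to stand in for project save:

```python
# Conceptual sketch of a "transient" dataset wrapper: data is available
# at runtime but discarded on serialization, so the saved form stays
# tiny no matter how many rows are held. Class and field names are
# hypothetical; pickle stands in for Ignition's project-save mechanism.
import pickle

class TransientDataset:
    def __init__(self, headers, rows):
        self.headers = headers
        self.rows = rows

    def __getstate__(self):
        # Keep the column headers but drop the rows when serialized.
        return {"headers": self.headers, "rows": []}

ds = TransientDataset(["t", "value"], [[i, i * 0.5] for i in range(100000)])
print(len(ds.rows))        # 100000 at runtime

restored = pickle.loads(pickle.dumps(ds))
print(len(restored.rows))  # 0 after a save/load round trip
```

The saved artifact keeps the dataset's shape (its headers) while shedding the bulk, which is the same trade the transient wrappers make on project save.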