Dataset Tag Size or Limits

I'm getting started with datasets. I have to show data in a power table and add or remove rows programmatically. Looking at the dataset tag, I got the idea of using a memory tag to hold all the information: the power table is bound to this tag, and when the tag updates, the table updates too. So if I have to add or remove rows, I modify the dataset tag directly.
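In an Ignition gateway script the pattern above would use `system.dataset.addRow`, `system.dataset.deleteRow`, and `system.tag.writeBlocking`. Since datasets are immutable, every edit returns a *new* dataset that must be written back to the tag. This pure-Python sketch uses stand-in helpers (not the real Ignition API) just to illustrate the flow:

```python
# Stand-ins for Ignition's dataset calls; in a real gateway script you would
# use system.dataset.addRow / system.dataset.deleteRow instead.
# Datasets are immutable, so each edit returns a NEW dataset.

def add_row(dataset, row):
    """Return a new dataset with `row` appended (stand-in for system.dataset.addRow)."""
    return dataset + [row]

def delete_row(dataset, index):
    """Return a new dataset without row `index` (stand-in for system.dataset.deleteRow)."""
    return dataset[:index] + dataset[index + 1:]

# Columns: hour (int), description (str)
data = [[0, "startup"], [1, "idle"]]

data = add_row(data, [2, "running"])
data = delete_row(data, 0)

print(data)  # [[1, 'idle'], [2, 'running']]
# Finally write the new dataset back so bound tables refresh, e.g.:
# system.tag.writeBlocking(["[default]MyDataset"], [data])
```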

I’ve seen on this forum that there is no limit on the number of rows or columns, but there is a limit on the size the tag takes.

How can I determine this size limit, or calculate the size of a dataset memory tag?
Is this a good implementation?

What is the conceivable maximum amount of data you want to store in this tag? If it’s in the thousands of rows, it’s probably fine. If it’s in the hundreds of thousands of rows or more, it’s probably not.

Also, are you going to be doing many frequent small updates? Dataset updates are whole, not a diff, so each time you update the tag all subscribers have to receive the entire dataset. Again, probably not something you’ll notice with hundreds of rows, but absolutely something you’ll notice with 100,000 rows.
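There is no published formula for a dataset tag's size, but you can ballpark it. The per-cell and per-row byte counts below are illustrative assumptions, not Ignition internals:

```python
def estimate_dataset_bytes(num_rows, int_cols, str_cols, avg_str_len=20):
    """Back-of-envelope size estimate for a dataset tag.

    Assumptions (illustrative only, not Ignition internals):
      - 8 bytes per integer cell
      - avg_str_len bytes per string cell
      - ~16 bytes of per-row overhead
    """
    per_row = int_cols * 8 + str_cols * avg_str_len + 16
    return num_rows * per_row

# 96 rows, one int column and one string column:
print(estimate_dataset_bytes(96, 1, 1))       # 4224 bytes (~4 KB) - trivial
# The same shape at 100,000 rows:
print(estimate_dataset_bytes(100_000, 1, 1))  # 4400000 bytes (~4.4 MB)
```

At hundreds of rows the size is negligible; the full-dataset re-send on every update is the cost that grows with row count.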

It is in the hundreds of rows; in fact, much less than that. I need 24 rows, one for each hour of the day, and I have to add or remove a row at least 4 times an hour, so that makes about 96 changes at most, but that figure is for limit purposes only. There will be int and string data, and I constantly have to be able to change data in the dataset, save it back into the tag, and finally see the update in at most 3 power tables.
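The arithmetic above can be made explicit. Since a dataset write re-sends the whole dataset (not a diff), each of the ~96 daily writes transfers all 24 rows, which is still a trivially small load:

```python
rows = 24                                  # one row per hour of the day
writes_per_hour = 4                        # add/remove at least 4 times an hour
writes_per_day = 24 * writes_per_hour      # 96 tag writes per day

# Every write re-sends the entire dataset to all subscribers:
rows_sent_per_day = writes_per_day * rows

print(writes_per_day, rows_sent_per_day)   # 96 2304
```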

I don’t think you would have any problems storing this data in a tag. You may want to put this into the database, for better guarantees about atomic reading and writing, and persistence, but that may be overkill for your needs.


Is that implementation good or bad practice? Is there a better way to add/remove rows and update data in a power table, or is managing a single dataset tag a good approach?

Generally I only use dataset tags for small datasets (<100 rows) that are statically configured or rarely updated, or where the configuration needs to be displayed on Ignition Edge. For example, I use them to hold my sequence steps, with a dataset of step numbers and step descriptions for each sequence. I could use SQL here as well, but I need to display these in Edge, which can't connect to a database. Anything else, like log data, goes to SQL.

Why do you only use dataset tags when they're rarely updated? If you have to update the data constantly, do you stop using a dataset? I have the same issue: for the moment, I cannot connect to a database.

I think you just answered your own question here. Since you don’t have access to a db, then you have to go with a tag.

The concern is about the size of the data possibly overflowing the internal database limits. If it is a small dataset, you're probably OK. However, if it's important information, or it changes often, it's generally preferred to use an external database. That way, if something goes wrong with your Ignition installation or server (like a hard drive failure), the data stays intact.


Yes, the overflow is the concern. A database will be added in the future, so this is a temporary solution.

Thank you for the comment!

@PGriffith @jsorlie

Peter, we are having a similar conversation right now, i.e. what is the maximum row count a dataset can handle. I guess they can technically handle a lot of rows, but my practical experience is that once they grow to tens of thousands of rows, some negative things happen speed-wise, in particular the speed of browsing tags in the Designer tag browser. Interaction with those tags on the UI also takes time.

We are looking at an application where we would need to store ~3M messages per week, and my instinct tells me this is best handled by SQL. But I'm looking at the counter-question that has been presented: what are the reasons *not* to do it with dataset tags?
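A rough footprint check makes the instinct concrete. Assuming an average serialized row of ~200 bytes (an assumed figure; adjust for your actual message size), a dataset tag would hold all of this in gateway memory, whereas SQL keeps it on disk:

```python
messages_per_week = 3_000_000
avg_row_bytes = 200                 # ASSUMED average row size; adjust as needed

weekly_bytes = messages_per_week * avg_row_bytes
print(weekly_bytes / 1e6)           # 600.0 -> ~600 MB per week, all in memory
```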



Because you’d have 3M messages sitting in memory instead of on disk behind a database where they belong?


Because they’d still go into a database anyway: the internal database in Ignition. Use a proper database instead for such loads.


That’s right, the values would be persisted as well. Double whammy of bad things happening.