EDIT: This is Ignition Edge, so no databases can be involved; otherwise I'd just write to one and be done with it.
I'm doing some work with scripting and a dataset tag. I do have pruning in my scripting, but it's based on a timestamp (everything older than X days gets deleted). On a particularly "busy" site where lots of events happen, though, I'm concerned about the datasets getting so big they affect performance: datasets are immutable, so every modification means pulling the whole thing in, recreating it with the new data, and writing it back out. Not necessarily a huge deal when there's a couple hundred rows, but I'm sure that at some point (thousands of rows? tens of thousands? millions?) it could become an issue.
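For context, the age-based pruning looks roughly like this (a minimal sketch; the tag path, column name, and retention window are placeholders, and it assumes the timestamp column holds java.util.Date values):

```python
TAG_PATH = "[default]Events/EventLog"   # hypothetical dataset tag path
RETENTION_DAYS = 30                     # hypothetical retention window

ds = system.tag.readBlocking([TAG_PATH])[0].value
cutoff = system.date.addDays(system.date.now(), -RETENTION_DAYS)

# Collect indices of rows older than the cutoff, then delete them in one pass.
tsCol = ds.getColumnIndex("timestamp")  # assumes a Date-typed column
stale = [r for r in range(ds.getRowCount())
         if ds.getValueAt(r, tsCol).before(cutoff)]

if stale:
    system.tag.writeBlocking([TAG_PATH], [system.dataset.deleteRows(ds, stale)])
```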
I'm pretty sure I'm going to have to impose some kind of limit on the number of records in these dataset tags, but I don't really know what that limit should be, or even where to start. My initial thought is something like 10,000 rows: whenever the dataset hits that, the oldest records get pruned off regardless of age. Is that a reasonable limit? Could I go higher? Should I stay lower?
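Something like the following is what I have in mind for the cap (sketch only; the tag path is a placeholder, and it assumes rows are appended in chronological order so row 0 is the oldest):

```python
TAG_PATH = "[default]Events/EventLog"   # hypothetical dataset tag path
MAX_ROWS = 10000                        # the limit in question

ds = system.tag.readBlocking([TAG_PATH])[0].value
overflow = ds.getRowCount() - MAX_ROWS

# Drop the oldest rows (indices 0..overflow-1) so the dataset stays at MAX_ROWS.
if overflow > 0:
    system.tag.writeBlocking([TAG_PATH],
                             [system.dataset.deleteRows(ds, range(overflow))])
```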
Update: I'm running some simulation logic in my PLC project that pushes events very frequently. I'm up to almost 1,000 rows in the dataset, and so far there's no increase in my script execution time (it ranges from almost 0 ms to around 250 ms).
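For anyone who wants to reproduce the timing, it can be captured by wrapping the script body roughly like this (logger name is hypothetical):

```python
from java.lang import System as JSystem

start = JSystem.nanoTime()
# ... dataset read / append / prune / write, as in the sketches above ...
elapsedMs = (JSystem.nanoTime() - start) / 1e6
system.util.getLogger("EventLogTiming").info("Script took %.1f ms" % elapsedMs)
```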