Transaction-group tables vs scripted tables

I am getting different results between the transaction group tables I made using the OPC browser and a table my colleague made with scripting.

In his script, he created a PyDataSet as a tag, I believe.

Have the tradeoffs been discussed much?
Or is one decidedly better?

I think he timed them both, and the script method was faster.

Not sure what you mean by that.

The OPC browser is read-only.

If I make a transaction group, the group stores that data into a table on the trigger.

With the other method, my friend is writing a script that sets all the data into a dataset that is a tag.

I guess I meant that, specifically, when I make the transaction group, I use the OPC browser to pick the tags.

So you’re talking about comparing dataset tags vs SQL tables?

yes

I am new and wondering what the tradeoffs are.

SQL tables are meant for storing and retrieving large amounts of data, and I would always trust SQL over dataset tags for critical data. You can also report on SQL table data far more easily than on dataset tag data, since you can use SQL queries and achieve more complex criteria in less code (compared to using Python).

I still use dataset tags, but only for configuration, e.g. storing the steps of one sequence (50 or so rows). I could store this in a SQL table as well, but you also need to consider where the data will be used. Ignition Edge clients can't access SQL data at all, and I want Edge clients to be able to see sequence steps, so my only option there is dataset tags. I also have a dataset tag that stores a lookup of UDT type against the popup window to open.
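To illustrate the reporting point, here is a rough sketch (the table name, tag path, and column names are made up, and it assumes Ignition 8's scripting functions): with a SQL table the criteria live in one query, while a dataset tag has to be read whole and filtered row by row in Python.

```python
# Hypothetical example: pull all downtime events over 5 minutes for one machine.
# Table name, tag path, and columns are placeholders for illustration.

# SQL table: one parameterized query does the filtering and sorting.
rows = system.db.runPrepQuery(
    "SELECT t_stamp, machine, duration_sec FROM downtime_events "
    "WHERE machine = ? AND duration_sec > ? ORDER BY t_stamp DESC",
    ["Press 1", 300],
)

# Dataset tag: read the entire dataset and filter it in a loop.
ds = system.tag.readBlocking(["[default]Config/DowntimeLog"])[0].value
matches = []
for i in range(ds.getRowCount()):
    if ds.getValueAt(i, "machine") == "Press 1" and ds.getValueAt(i, "duration_sec") > 300:
        matches.append([ds.getValueAt(i, "t_stamp"),
                        ds.getValueAt(i, "machine"),
                        ds.getValueAt(i, "duration_sec")])
```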


With SQL, you are dealing with rows. If you add a row, you’re done. To add a row to a dataset, you are creating a whole new dataset. There is no differential.
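To make that concrete, a minimal sketch (table name and tag path are hypothetical) of adding one row each way:

```python
# SQL: inserting a row touches only that row.
system.db.runPrepUpdate(
    "INSERT INTO sequence_steps (step, description, duration_sec) VALUES (?, ?, ?)",
    [10, "Fill tank", 30],
)

# Dataset tag: the whole dataset is read, copied with the extra row, and written back.
tag_path = "[default]Config/SequenceSteps"
ds = system.tag.readBlocking([tag_path])[0].value           # read the existing dataset
new_ds = system.dataset.addRow(ds, [10, "Fill tank", 30])   # returns a brand-new dataset
system.tag.writeBlocking([tag_path], [new_ds])              # write the full dataset back
```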


Thanks again guys

In addition to everything @nminchin said (which is true), storing data in a database also allows other users who have access to that database (likely the IT department or adjacent) to query the data and do with it what they want, without fear that they would mess up anything in your Ignition application.

In the case of a dataset tag, the only way someone else could view it would be either in your client (if you're displaying it somehow), or by opening the Designer (which is probably not something you want anyone who's not an integrator to do).


Also, regarding speed: I don't know how the transaction manager works, but it may send its insert query to the store and forward system for execution, in which case the new row would appear only once the S&F system has processed it.
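For what it's worth, the scripting API makes that distinction explicit: system.db.runPrepUpdate goes straight to the database connection, while system.db.runSFPrepUpdate hands the query to store-and-forward, so the row only shows up after the S&F queue has forwarded it. Whether transaction groups use S&F internally is the part I'm not sure about either. A quick sketch, with a made-up table and datasource name:

```python
# Hypothetical table and datasource names, for illustration only.
query = "INSERT INTO press_counts (machine, part_count) VALUES (?, ?)"
args = ["Press 1", 1234]

# Direct insert: returns after the database has accepted the row.
system.db.runPrepUpdate(query, args, "MyDatabase")

# Store-and-forward insert: returns once the query is queued; the new row
# appears in the table only after the S&F engine has processed it.
system.db.runSFPrepUpdate(query, args, ["MyDatabase"])
```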


I believe the time trial was something like 7 ms vs 15 ms.

But if I understand correctly, the time for the dataset tag is proportional to its current size, so the execution time will climb the larger the table becomes.


One other thing I will say is that I trust Ignition's coding more than my own, so in your scenario I would trust the transaction groups more than something I coded myself to do the same thing. Though if it's just an INSERT query, it's most likely not going to throw you any crazy errors.

Also, until this thing is up and running, trying to prematurely optimize this might end up being a waste of time. Until you get real data flowing at the production rate you’d expect, it can be hard to really know where the bottlenecks are.

Edit: I would not sweat an 8 ms difference.

The other thing you can do, of course, instead of using transaction groups, is to script inserting rows into the SQL table. I've never used the transaction manager; I never took the time to understand it :sweat_smile: and I find scripting easier and far more customisable.
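In case it helps, a bare-bones sketch of that approach: a gateway tag change script that fires on a trigger tag and inserts one row, which is roughly what a basic transaction group does. All tag paths and the table name here are hypothetical.

```python
# Gateway Events -> Tag Change script, watching a hypothetical trigger tag.
# newValue and initialChange are provided by Ignition in this scope.
if newValue.value and not initialChange:
    paths = [
        "[default]Press1/PartCount",
        "[default]Press1/CycleTime",
    ]
    values = [qv.value for qv in system.tag.readBlocking(paths)]

    system.db.runPrepUpdate(
        "INSERT INTO press_log (t_stamp, part_count, cycle_time) "
        "VALUES (CURRENT_TIMESTAMP, ?, ?)",
        values,
    )
```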

Whereas that's what I started with, back in the FactorySQL days. lol.
