Client Performance Strategy


This is going to be a fairly vague question (sorry), but I’m curious if anyone could share some general strategies for building clients that use “advanced” features and a lot of SQL queries more efficiently. I have been building a client that works as a scheduling tool; the screen currently has about 50 power tables on it. The power tables use almost all of the built-in extension functions (row dragging, cell format, header format, popup trigger, etc.). On a fast computer this works fine, but the way I have it built right now it drags too much CPU on slower machines and is kind of cumbersome to use.

Is there a good way I can figure out specifically where the slowdown is coming from, i.e. scripting, SQL queries, property bindings, power table extension functions, etc.?

I haven’t adjusted the database connection settings from the defaults. Are there particular DB connection settings that might help with performance from the client side? As far as I can tell, the DB is not dragging anything down (looking at live values from the Gateway DB config, I am not able to see any active queries, and the average duration stays at 0.0 s).

Also, a similar question: is it possible to have the user change the SQL poll rate on the SQL bindings from Absolute (5 seconds) to Off? I’m imagining adding a custom property and linking it to the poll rate on the tables, but I’m not sure how to access this poll rate property. This would allow slower machines to turn off “auto refresh” on the tables, which might speed things up a little.

I would imagine that if it were something like the popup trigger or any other extension function, you would see the slowdown when the action was supposed to happen. In other words, when you dropped the rows, the table would stay blank for a second and then the action would happen. If this is the case, you could try to use the system.util.invokeAsynchronous function.
Opening the diagnostic window on the client could help find the problem.
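As a rough sketch of that async pattern (using plain Python threading as a stand-in for `system.util.invokeAsynchronous`, with a hypothetical `fetch_schedule_rows` function in place of a real `system.db.runQuery` call):

```python
import threading

def fetch_schedule_rows():
    # Stand-in for a long-running SQL query (system.db.runQuery in Ignition)
    return [(1, 'Line A'), (2, 'Line B')]

results = {}
done = threading.Event()

def background_work():
    # In Ignition you would launch this with:
    #   system.util.invokeAsynchronous(background_work)
    rows = fetch_schedule_rows()
    # Ignition components must be updated on the UI thread, so you would
    # hand the result back with system.util.invokeLater(...) there.
    results['rows'] = rows
    done.set()

threading.Thread(target=background_work).start()
done.wait()  # the UI thread would NOT block like this; it's only for the demo
```

The point is simply that the slow work happens off the UI thread, so the table doesn’t freeze while the drop action runs.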

I don’t think so. I would encourage you to consolidate your queries as much as possible. For example, if two different power tables display data that could come from a single query, run that one query on the root container and then distribute the data to the individual power tables.
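A minimal sketch of that consolidation idea, assuming a hypothetical schedule query whose rows carry a machine column you can split on:

```python
# Hypothetical result of one consolidated query: (machine, start_time, job)
rows = [
    ('Press1', '08:00', 'JOB-1'),
    ('Press2', '08:30', 'JOB-2'),
    ('Press1', '09:00', 'JOB-3'),
]

def split_by_machine(rows):
    """Distribute one query's rows into per-table datasets keyed by machine."""
    tables = {}
    for machine, start, job in rows:
        tables.setdefault(machine, []).append((start, job))
    return tables

# In Ignition you would run the query once on the root container, then
# assign each slice to the corresponding power table's data property.
tables = split_by_machine(rows)
```

One query round-trip instead of fifty is usually a much bigger win than tuning any individual binding.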

Not that I can tell. The options on those bindings look to be statically defined. My question to you would be, is there a need to poll every 5 seconds? Could the table be driven by a date range (in which case the poll could be turned off). Could it be updated every 30 seconds or every minute?

Hope this helps,

We have the same approach with our solutions.

When you mention using queries, do you build all of them in the Designer? In our solution, we use SQL Server with stored procedures, even for simple SELECT statements. We then execute them from Ignition.

EXEC [schema].[procedure_name] @parameter={Root}

It can help performance, since the query logic is executed on the database server instead of the client itself. You have to make sure your procedures are performant, though.
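For illustration, a hypothetical procedure of the kind the EXEC call above could target (SQL Server / T-SQL syntax; the table and column names are made up):

```sql
-- Simple parameterised SELECT wrapped in a stored procedure
CREATE PROCEDURE dbo.GetScheduleForMachine
    @MachineId INT
AS
BEGIN
    SET NOCOUNT ON;
    SELECT StartTime, JobNumber
    FROM dbo.Schedule
    WHERE MachineId = @MachineId
    ORDER BY StartTime;
END
```

The procedure body is compiled and planned on the server, and the client only ships the EXEC call and parameter values.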

We completely deactivated the polling mechanism by switching it to Off. We refresh the controls on certain events. For example, when we create, modify or delete records that are shown in a grid. Then, we call the grid’s data refresh method.

# Get a reference to the component, then re-run its 'data' property binding
ctrl = ....getComponent('someControlComponent')
system.db.refresh(ctrl, 'data')

If you really want to refresh based on time, you can also use the system’s seconds or minutes with a modulus to generate a trigger that is bound to a property on the window (or root container), then use the propertyChange script to call the control’s data refresh method mentioned earlier.

I also try to perform all SQL queries based on events rather than periodic polling. Depending on how much data manipulation I require I may or may not revert to stored procedures, really varies from project to project.
