Stored procedure call overhead - is there a way to batch?

We have a data query application that calls a stored procedure from a script, potentially 90 times for a single execution. There appears to be some overhead associated with creating and executing the stored procedure, sometimes making the script itself take 10+ seconds.
Without moving away from the stored procedure approach, is there a way to batch these calls to optimize performance?
Generically, we pass a couple of values to the stored procedure, then write the response to a PLC tag. That is currently done via individual system.tag.write operations, and I believe we would get some improvement by switching to system.tag.writeAll and batching the writes, but the developer is saying that the bottleneck is the SP call.
Thoughts?

If your data allows it, modify your stored procedure to accept a variable number of parameter sets and return a single dataset with all the necessary data. Have your script build all the parameter sets, pass them in one call to the stored procedure, and then pick the data you need out of the response.
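
A rough sketch of what the script side might look like, assuming a hypothetical modified procedure named getBatchValues that takes one delimited string of parameter pairs and returns one result row per pair (the names, packing format, and column names here are illustrative, not your actual procedure):

```python
# Hypothetical sketch - procedure name, parameter packing, and column names are assumptions.
pairs = [("LINE1", 42), ("LINE2", 17), ("LINE3", 99)]  # the ~90 parameter sets

# Pack every parameter set into one delimited string, e.g. "LINE1,42;LINE2,17;LINE3,99"
paramBlob = ";".join("%s,%d" % (name, num) for name, num in pairs)

call = system.db.createSProcCall("getBatchValues", "MyDatabase")
call.registerInParam(1, system.db.VARCHAR, paramBlob)
system.db.execSProcCall(call)  # one round trip instead of ~90

results = call.getResultSet()  # one row per parameter set
for row in range(results.getRowCount()):
    key = results.getValueAt(row, "paramKey")
    value = results.getValueAt(row, "resultValue")
    # ...match the value back to the tag it belongs to...
```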

You most certainly would, but your main performance hit is the 90 calls to the same stored procedure.

If you can't change anything else, at the very least consolidate all your tag writes into a single system.tag.writeBlocking (or system.tag.writeAll if you are on Ignition 7.9) at the end of your script.
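
For example (a minimal sketch with placeholder tag paths): collect paths and values as you loop, then write once at the end.

```python
# Hypothetical sketch - tag paths and values are placeholders.
paths = []
values = []

for i in range(90):
    result = 0  # ...whatever the i-th stored procedure call returned...
    paths.append("[default]Line1/Results/Value%d" % i)
    values.append(result)

# One round trip to the tag system instead of 90 individual writes.
system.tag.writeBlocking(paths, values)   # Ignition 8.x
# system.tag.writeAll(paths, values)      # Ignition 7.9
```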

Consider nesting that stored procedure inside a new stored procedure that accepts array parameters. It would call the original SP for each element and assemble each result into a row of a response dataset. One call to process many operations.

You'll need a competent database (with UNNEST or equivalent support).
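
The calling side of such a wrapper might look roughly like this (a sketch only: batchLookup is an assumed procedure name, and whether system.db.ARRAY parameters survive the trip intact depends on the JDBC driver, as noted below):

```python
# Hypothetical sketch - batchLookup is an assumed wrapper procedure taking two
# parallel arrays and returning one row per element pair.
names = ["LINE1", "LINE2", "LINE3"]
numbers = [42, 17, 99]

call = system.db.createSProcCall("batchLookup", "MyDatabase")
call.registerInParam(1, system.db.ARRAY, names)
call.registerInParam(2, system.db.ARRAY, numbers)
system.db.execSProcCall(call)

results = call.getResultSet()  # one row per name/number pair
```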


Thanks, I've tossed the idea to the DB expert. The backend is DB2; I'm not sure if that counts as competent. :person_shrugging:

Yes, competent, according to its documentation.

As long as its JDBC driver doesn't break the arrays on their way to the DB.