I am currently inserting values into a table (say, machine status), and this table gets updated once the status has changed.
I am using an update query with some subqueries to do this. My dilemma is whether this logic slows down the gateway (as a long-running query) or impacts the DB.
UPDATE cdcms.tbl_productiondata AS pd
JOIN (SELECT RunID, EquipmentID
        FROM cdcms.tbl_productiondata
       WHERE EquipmentID = :iEquipmentID
       ORDER BY RunID DESC
       LIMIT 1, 1) AS t
SET pd.Actual_OrderEndTime = :iOrderEndTime,
    pd.Actual_OrderRunTime = TIME_TO_SEC(TIMEDIFF(:iOrderEndTime, pd.Actual_OrderStartTime)),
    pd._modifiedDate = NOW(),
    pd._modifiedUser = 'ign_upd'
WHERE pd.RunID = t.RunID
  AND pd.EquipmentID = t.EquipmentID
  AND pd.Actual_OrderEndTime IS NULL
Which of the options below is recommended for the function described above:
an update query, a stored procedure, or system.db.runUpdateQuery?
Well, no matter how you execute it, it must be an update query, so that “logic” doesn’t make sense in this context.
You can choose to execute the query from a Named Query, a stored procedure, or one of the scripting functions. Really that comes down to personal preference.
Personally I would use a Named Query, as I like the advantages that they give over the other options.
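To make the two scripting-side options concrete, here is a minimal sketch assuming Ignition's system.db API. The Named Query path "Production/UpdateRunEnd" is hypothetical, and the small stub class only exists so the sketch runs outside a gateway; in a real project, system.db is provided by Ignition and the stub is not needed.

```python
class _FakeDb(object):
    """Stand-in for Ignition's system.db so the sketch runs outside a gateway."""
    def runNamedQuery(self, path, params):
        # Ignition runs the stored, parameterized Named Query and returns
        # the affected row count for update-type queries.
        return 1

    def runPrepUpdate(self, query, args):
        # Ignition binds each "?" placeholder in order and runs the update.
        return 1

db = _FakeDb()  # in Ignition scripting this would simply be system.db

def close_run(equipment_id, end_time):
    # Option 1: Named Query -- the SQL lives in the project, and parameters
    # are passed by name, matching the :iEquipmentID / :iOrderEndTime style.
    return db.runNamedQuery("Production/UpdateRunEnd",
                            {"iEquipmentID": equipment_id,
                             "iOrderEndTime": end_time})

def close_run_prep(equipment_id, end_time):
    # Option 2: inline prepared statement via runPrepUpdate, with positional
    # "?" placeholders instead of named parameters.
    sql = """
        UPDATE cdcms.tbl_productiondata AS pd
        JOIN (SELECT RunID, EquipmentID
                FROM cdcms.tbl_productiondata
               WHERE EquipmentID = ?
               ORDER BY RunID DESC
               LIMIT 1, 1) AS t
          ON pd.RunID = t.RunID AND pd.EquipmentID = t.EquipmentID
         SET pd.Actual_OrderEndTime = ?,
             pd.Actual_OrderRunTime =
                 TIME_TO_SEC(TIMEDIFF(?, pd.Actual_OrderStartTime)),
             pd._modifiedDate = NOW(),
             pd._modifiedUser = 'ign_upd'
       WHERE pd.Actual_OrderEndTime IS NULL
    """
    return db.runPrepUpdate(sql, [equipment_id, end_time, end_time])
```

Either way the same update runs on the DB; the Named Query adds caching, a central place to edit the SQL, and named parameters.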
As to where the impact is, that depends on a few things.
Running queries is a blocking task, which means that if you run one in a place that should execute as quickly as possible (e.g. the Vision GUI thread, Tag Value Change scripts, etc.) you could very well see the impact on overall latency. The actual processing impact will occur on the DB; in the unfortunate event that your DB and gateway are on the same server, the gateway could also feel that impact.
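The usual way around that blocking is to hand the query off to a background thread; in Ignition Vision that is what system.util.invokeAsynchronous (paired with system.util.invokeLater to get back onto the GUI thread) is for. Here is a minimal self-contained sketch where a plain thread stands in for invokeAsynchronous:

```python
import threading

def invoke_asynchronous(fn):
    """Stand-in for system.util.invokeAsynchronous: run fn off the caller's thread."""
    t = threading.Thread(target=fn)
    t.start()
    return t

results = []

def run_update():
    # Placeholder for the blocking call, e.g. system.db.runNamedQuery(...);
    # any UI update afterwards would go through system.util.invokeLater.
    results.append("updated")

t = invoke_asynchronous(run_update)
t.join()  # only so this sketch can check the result; GUI code would not block on join()
```

The GUI thread returns immediately after starting the worker, so the latency of the query never stalls the interface.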
All of this relies heavily on how your system is set up, the flavor of DB you are using, and how many rows you’re operating on, not to mention any overhead from things like transfer rate.