Does anyone have a procedure they have developed to 'clean up' the sqlt_data_ tables when history configuration is dropped for a tag? We are going to either stop collecting on a significant number of tags or remove the tags entirely, and I would like to remove the history table entries for those tags. We do not 'prune' our history by date; that is to say, we keep our history indefinitely. An example SQL query that takes the old tag name as a parameter and then drops the unneeded entries is what I am looking for. Thanks for any help you may offer.
I have had to dig into the history tables Ignition makes quite often.
In short, it is not a straightforward database schema.
The biggest problem in pruning data will be the partitions in your database.
Start here if you are not aware of how the tables are structured:
Basically, you need to find each tagid for your particular tagpath and then use those ids in your WHERE clause to select the rows to delete in each partition.
I would suggest a few things to be safe:
- Before running the DELETE statement, use a SELECT with the same WHERE condition to verify exactly which rows will be removed.
- Even though it will be more time consuming, I think it would be safer to go one partition at a time so you know exactly what you are deleting.
- Once you have purged the records from every partition, you can delete the ids from the tag entry table (sqlth_te).
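Putting those steps together, here is a rough sketch. It assumes the default historian table names (sqlth_te for tag entries, sqlth_partitions for the partition list, and monthly sqlt_data_X_YYYY_MM data tables); the partition name and id 123 below are placeholders you would substitute from your own results:

```sql
-- 1) Find the id(s) for the tagpath you want to purge (verify these rows first).
SELECT id, tagpath FROM sqlth_te WHERE tagpath = 'path/to/my/tag';

-- 2) List the partition tables you will need to visit.
SELECT pname FROM sqlth_partitions;

-- 3) For each partition, verify what would be deleted...
SELECT COUNT(*) FROM sqlt_data_1_2023_01 WHERE tagid = 123;

-- 4) ...then delete, one partition at a time.
DELETE FROM sqlt_data_1_2023_01 WHERE tagid = 123;

-- 5) After every partition is purged, remove the tag entry itself.
DELETE FROM sqlth_te WHERE id = 123;
```

As with anything destructive, test this against a backup/restore of the database before touching production.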
Thank you Brandon1. We are also now dealing with an issue where two tags have accumulated, in one case, two thousand tag entry rows (i.e. two thousand tagids). It is causing queries to time out and is potentially linked to occasional deadlocks on the database. I think the approach you outlined for tag deletion could be modified to perform table updates that 'clean up' the tagid references, so that the historical queries, which had two thousand '?' parameters, would actually run instead of timing out. Thanks much.
If you have a significant number of rows to remove, you should write the query in such a way that the deletions are batched.
```sql
DECLARE @batch AS INT;
DECLARE @i AS INT;
DECLARE @iter_max AS INT;

SET @batch = 500;     -- maximum number of rows to remove per batch
SET @i = 0;
SET @iter_max = 100;  -- maximum number of batches to remove

WHILE (1 = 1)
BEGIN
    -- Fill in your partition table and WHERE condition, then uncomment:
    --DELETE TOP (@batch) FROM your_table

    IF (@@ROWCOUNT < @batch OR @i = @iter_max)
        BREAK;

    SET @i = @i + 1;
END
```
Since you are limiting the number of rows deleted per statement, and running many short queries instead of one long one, this type of logic keeps lock times short and greatly reduces the chance of deadlocks.
I would also recommend doing this from your database management software (e.g. ssms for MSSQL).
Another thing that will help is ensuring that the historian tables you are working with are indexed properly; in some cases they are not out of the box.
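For example, an index covering the columns the historian filters on (tagid plus t_stamp) can be added per partition. A sketch for MSSQL; the index name and partition table name are placeholders, and you would repeat it for each partition that lacks one:

```sql
-- Check existing indexes first (e.g. sp_helpindex 'sqlt_data_1_2023_01'),
-- then add one where it is missing.
CREATE NONCLUSTERED INDEX IX_sqlt_data_tagid_tstamp
ON sqlt_data_1_2023_01 (tagid, t_stamp);
```

Building the index will itself lock the table for a while on a large partition, so schedule it accordingly.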