Tags getting frozen due to Store and Forward

Hi,
My tags are getting frozen because their data is being quarantined in Store and Forward.
I don't have much knowledge of Store and Forward; for now I have increased the memory buffer size.
After I deleted the quarantined data, the tags became unfrozen.

What should I do about this? Please guide me.

Thanks!

What do you mean by this? Do you mean your trends?

I am storing history data in a DB.
Here is a screenshot of the error:

```
java.lang.RuntimeException: java.sql.BatchUpdateException: Violation of PRIMARY KEY constraint 'PK__sqlt_dat__BE126DD1975C00AB'. Cannot insert duplicate key in object 'dbo.sqlt_data_1_2025_01'. The duplicate key value is (63, 1737730046000).
    at com.inductiveautomation.ignition.common.functional.FragileConsumer.lambda$wrap$1(FragileConsumer.java:47)
    at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(Unknown Source)
    at java.base/java.util.stream.ReferencePipeline$2$1.accept(Unknown Source)
    at java.base/java.util.stream.ReferencePipeline$3$1.accept(Unknown Source)
    at java.base/java.util.stream.ReferencePipeline$2$1.accept(Unknown Source)
    at java.base/java.util.stream.Streams$StreamBuilderImpl.forEachRemaining(Unknown Source)
    at java.base/java.util.stream.ReferencePipeline$Head.forEach(Unknown Source)
    at java.base/java.util.stream.ReferencePipeline$7$1.accept(Unknown Source)
    at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(Unknown Source)
    at java.base/java.util.stream.AbstractPipeline.copyInto(Unknown Source)
    at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(Unknown Source)
    at java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(Unknown Source)
    at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(Unknown Source)
    at java.base/java.util.stream.AbstractPipeline.evaluate(Unknown Source)
    at java.base/java.util.stream.ReferencePipeline.forEach(Unknown Source)
    at com.inductiveautomation.ignition.gateway.history.sf.sinks.AbstractDatasourceSink.storeToDatasource(AbstractDatasourceSink.java:186)
    at com.inductiveautomation.ignition.gateway.history.sf.sinks.AbstractDatasourceSink.storeData(AbstractDatasourceSink.java:156)
    at com.inductiveautomation.ignition.gateway.history.sf.sinks.AggregateSink.storeData(AggregateSink.java:180)
    at com.inductiveautomation.ignition.gateway.history.forwarders.ForwarderThread.run(ForwarderThread.java:147)
Caused by: java.sql.BatchUpdateException: Violation of PRIMARY KEY constraint 'PK__sqlt_dat__BE126DD1975C00AB'. Cannot insert duplicate key in object 'dbo.sqlt_data_1_2025_01'. The duplicate key value is (63, 1737730046000).
    at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.executeBatch(SQLServerPreparedStatement.java:2101)
    at org.apache.commons.dbcp2.DelegatingStatement.executeBatch(DelegatingStatement.java:242)
    at org.apache.commons.dbcp2.DelegatingStatement.executeBatch(DelegatingStatement.java:242)
    at com.inductiveautomation.ignition.gateway.datasource.DelegatingStatement.executeBatch(DelegatingStatement.java:60)
    at com.inductiveautomation.ignition.gateway.datasource.SRConnectionWrapper$SRStatement.executeBatch(SRConnectionWrapper.java:767)
    at com.inductiveautomation.gateway.tags.history.storage.TagHistoryDatasourceSink$BatchedTagInsert.onComplete(TagHistoryDatasourceSink.java:1296)
    at com.inductiveautomation.gateway.tags.history.storage.TagHistoryDatasourceSink.insertTagValues(TagHistoryDatasourceSink.java:1088)
    at com.inductiveautomation.gateway.tags.history.storage.TagHistoryDatasourceSink.storeScanClassSet(TagHistoryDatasourceSink.java:550)
    at com.inductiveautomation.gateway.tags.history.storage.TagHistoryDatasourceSink.storeDataToDatasource(TagHistoryDatasourceSink.java:525)
    at com.inductiveautomation.ignition.gateway.history.sf.sinks.AbstractDatasourceSink.lambda$storeToDatasource$2(AbstractDatasourceSink.java:186)
    at com.inductiveautomation.ignition.common.functional.FragileConsumer.lambda$wrap$1(FragileConsumer.java:45)
```

That's the error... so something is trying to double-log data.
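
If you want to confirm the duplicate really exists, you can query the partition table named in the error directly. Here's a quick sketch for the Script Console, using the tagid and t_stamp from the message; the connection name is a placeholder for whatever your historian connection is called:

```python
# Check whether the (tagid, t_stamp) pair from the error is already stored
# in the monthly partition table the stack trace names.
rows = system.db.runPrepQuery(
    "SELECT tagid, t_stamp, intvalue, floatvalue "
    "FROM sqlt_data_1_2025_01 WHERE tagid = ? AND t_stamp = ?",
    [63, 1737730046000],
    'History',  # placeholder: use your historian database connection name
)
print(rows.getRowCount())  # 1 means the row was already inserted before the retry
```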

How do you have your historian set up? Database connections?
Does this happen routinely?

Yes, the connection is to SQL Server.
History is updating daily.

Does this happen often?

How do you have your database connections set up? Can you show that?
Is this a redundant Gateway system?
Are you using the Ignition Historian, or are you scripting data inserts into the database?

Yes, it happens quite frequently.
No, it is not a redundant system.
We have history enabled on our OPC tags; I think that is how the data is being stored.

Can you show how your database connection is configured? I've seen this when a datasource is being used for the historian and there is a failover datasource configured rather than using the Tag Splitter.

Can you show a screenshot of the Config -> Tags -> History page? That will help with understanding how the Tag Historian is set up as well.

I didn't see any failover datasource.

That's the tag history configuration.
Can you check the Config -> Database -> Connections -> ARTIE_HISTORY configuration? That is where the failover datasource would be located.

I checked there; no failover datasource is assigned.

You may want to reach out to support directly and have them take a look. Everything looks set up correctly with no red flags, so either there is an issue in the database directly, or logging is happening from a different source, creating those issues.
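
One thing you can check yourself in the meantime: look up which tag id 63 maps to, so you know which tag is being double-logged. A sketch against the historian's tag-entry table, using the ARTIE_HISTORY connection from your config:

```python
# sqlth_te is the historian's tag metadata table; id 63 comes from the
# duplicate key in the stack trace above.
rows = system.db.runPrepQuery(
    "SELECT id, tagpath, retired FROM sqlth_te WHERE id = ?",
    [63],
    'ARTIE_HISTORY',
)
for i in range(rows.getRowCount()):
    print(rows.getValueAt(i, 'tagpath'))
```

If more than one history-enabled source resolves to that tag path, that's a likely culprit.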

Okay, I will check with them.
For now, is there any workaround I can do, like a script that checks whether tags are frozen and, if so, tries to push the data to the DB and then deletes the quarantined data?

Not that I'm aware of.
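
There's no public scripting API I know of for retrying or clearing quarantined data. If you just want to detect frozen tags so you can react sooner, here's a minimal sketch, assuming a gateway timer script and a hypothetical [default]Plant folder holding your history tags:

```python
# Minimal sketch: warn when tags under a folder haven't updated recently.
# The folder path and the 300 s staleness threshold are assumptions; adjust both.
results = system.tag.browse('[default]Plant').getResults()
paths = [str(r['fullPath']) for r in results if not r['hasChildren']]
if paths:
    values = system.tag.readBlocking(paths)
    now = system.date.now()
    logger = system.util.getLogger('FrozenTagCheck')
    for path, qv in zip(paths, values):
        ageSec = system.date.secondsBetween(qv.timestamp, now)
        if ageSec > 300:
            logger.warn('%s last updated %d s ago' % (path, ageSec))
```

That only alerts you, though; it won't push or delete anything in the quarantine.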

How do I raise a ticket with support?

https://support.inductiveautomation.com is typically where I start.

Note... this forum isn't official support.

Are the historical tags configured on the local gateway, or are they configured on a remote gateway?

In general, this means that the historical data was already inserted into the database, but something went wrong during the acknowledgement of the insert, so the data is requested to be inserted again. The duplicate-entry error you're seeing appears to be an expensive operation. This can be the result of a poor network connection between remote gateways and the DB, among other causes.
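
To make that failure mode concrete, here's an illustrative sketch; this is not Ignition's internals, just the shape of the retry described above:

```python
# Illustration only: the insert succeeds, the acknowledgement is lost, and
# the retry violates the primary key on (tagid, t_stamp).
def forward(batch, insert, ack_received):
    insert(batch)           # rows land in dbo.sqlt_data_1_2025_01
    if not ack_received():  # e.g. network drop between gateway and DB
        insert(batch)       # retry -> duplicate-key error from SQL Server
```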
+1 for IA support. If you have a support agreement, this would be a great time to use it. They might recommend other settings to optimize your S&F engine, as well as help troubleshoot the underlying issue.

It is on the local gateway.