Major latency (but fixed after reboot)

Hi,

Today (well, tonight actually) we had a major incident that reduced connectivity from the gateway to the PLCs.
Unfortunately it filled the entire log, so I cannot get to the root cause, I think. But I did find this:

No optimized AddressBlock list found for items: [com.inductiveautomation.xopc.server.addressspace.contexts.WriteContext$PendingDriverWrite@5d2b7cf, com.inductiveautomation.xopc.server.addressspace.contexts.WriteContext$PendingDriverWrite@54d38421, com.inductiveautomation.xopc.server.addressspace.contexts.WriteContext$PendingDriverWrite@133810ef, com.inductiveautomation.xopc.server.addressspace.contexts.WriteContext$PendingDriverWrite@1dd6b9a3, com.inductiveautomation.xopc.server.addressspace.contexts.WriteContext$PendingDriverWrite@daa1919, com.inductiveautomation.xopc.server.addressspace.contexts.WriteContext$PendingDriverWrite@45906601, com.inductiveautomation.xopc.server.addressspace.contexts.WriteContext$PendingDriverWrite@167ca934, com.inductiveautomation.xopc.server.addressspace.contexts.WriteContext$PendingDriverWrite@3ecf01f9, com.inductiveautomation.xopc.server.addressspace.contexts.WriteContext$PendingDriverWrite@63f0b5e8, com.inductiveautomation.xopc.server.addressspace.contexts.WriteContext$PendingDriverWrite@251677f9, com.inductiveautomation.xopc.server.addressspace.contexts.WriteContext$PendingDriverWrite@56504f66, com.inductiveautomation.xopc.server.addressspace.contexts.WriteContext$PendingDriverWrite@10d64c9c]

Can somebody explain what this is about?

KR,
Steven O

We are indeed using 7.9.
But I don’t know if we are using .16. Where can I find this information?

And, it has been running smoothly for at least 9 months. What could have triggered this?

We are using 7.9.13 (64-bit).

From other similar threads it seems to be tied to the Siemens driver. Either way, it's probably a good idea to upgrade to the latest 7.9 version.

Hi Jordan,

You are correct, we also suspect it to be connected to the Siemens driver. I am planning to update tomorrow afternoon, when the plant is not operational.

Any idea what could have triggered this?

KR
Steven

I also notice an enormous amount of CPU usage from Ignition.

[screenshot: gateway CPU usage]

Maybe it is time for me to contact support…

The changelog specifies multiple writes to the same address. So it could be from anything writing to the PLC. Any new or changed scripts?
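
For what it's worth, even a small gateway script can queue several pending writes to the same address in one call. A purely hypothetical sketch (the tag paths and values below are invented, not taken from your project):

paths = [
    "[default]Line1/Setpoint",
    "[default]Line1/Setpoint",  # accidental duplicate path
    "[default]Line2/Setpoint",
]
values = [100, 100, 250]

# In 7.9, system.tag.writeAll() issues one write per entry, so a duplicate
# path like the one above could show up as multiple pending driver writes
# to the same PLC address.
system.tag.writeAll(paths, values)

Anything like that in a timer script, tag change script, or transaction group binding would be worth checking.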

I’d contact support after upgrading, because that will be one of the first things they will tell you to try. :wink:

The problem is that our gateway serves multiple plants across Europe. When I asked the other plants, they all responded that nothing has been changed… :upside_down_face:

I am going to update the version tomorrow, and if that does not help I'll roll back to last Friday.

I was able to find the problem. When I run a report, some threads skyrocket.
Thread dumps below.
Ignition-HCR-SCADA_thread_dump20220218-113436.txt (166.8 KB)
Ignition-HCR-SCADA_thread_dump20220218-120734.txt (336.6 KB)


This is the thread detail regarding the problem:
Thread [webserver-351] id=351, (RUNNABLE)

org.python.core.PyInteger.asPyInteger(PyInteger.java:99)
org.python.core.PyInteger.int_new(PyInteger.java:61)
org.python.core.PyInteger$exposed___new__.new_impl(Unknown Source)
org.python.core.PyType.invokeNew(PyType.java:466)
org.python.core.PyType.type___call__(PyType.java:1558)
org.python.core.PyType.call(PyType.java:1548)
org.python.core.PyObject.call(PyObject.java:387)
org.python.core.PyObject.call(PyObject.java:391)
org.python.pycode._pyx28.PersTabelMeldingen$2(:154)
org.python.pycode._pyx28.call_function()
org.python.core.PyTableCode.call(PyTableCode.java:165)
org.python.core.PyBaseCode.call(PyBaseCode.java:120)
org.python.core.PyFunction.call(PyFunction.java:307)
org.python.pycode._pyx40.updateData$1(:155)
org.python.pycode._pyx40.call_function()
org.python.core.PyTableCode.call(PyTableCode.java:165)
org.python.core.PyBaseCode.call(PyBaseCode.java:301)
org.python.core.PyFunction.function___call__(PyFunction.java:376)
org.python.core.PyFunction.call(PyFunction.java:371)
org.python.core.PyFunction.call(PyFunction.java:361)
org.python.core.PyFunction.call(PyFunction.java:356)
com.inductiveautomation.ignition.common.script.ScriptManager.runFunction(ScriptManager.java:649)
com.inductiveautomation.ignition.common.script.ScriptManager$ScriptFunctionImpl.invoke(ScriptManager.java:759)
com.inductiveautomation.reporting.gateway.data.ScriptReportDataSource.gatherData(ScriptReportDataSource.java:57)
com.inductiveautomation.reporting.gateway.ReportingGatewayHook.getReportData(ReportingGatewayHook.java:338)
com.inductiveautomation.reporting.gateway.ReportingGatewayHook$RPC.getReportData(ReportingGatewayHook.java:529)
sun.reflect.GeneratedMethodAccessor161.invoke(Unknown Source)
sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
java.lang.reflect.Method.invoke(Unknown Source)
com.inductiveautomation.ignition.gateway.servlets.gateway.functions.ModuleInvoke.invoke(ModuleInvoke.java:172)
com.inductiveautomation.ignition.gateway.servlets.Gateway.doPost(Gateway.java:405)
javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
com.inductiveautomation.ignition.gateway.bootstrap.MapServlet.service(MapServlet.java:85)
org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:837)
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:583)
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226)
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1160)
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:511)
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1092)
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:52)
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
org.eclipse.jetty.server.Server.handle(Server.java:518)
org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:308)
org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:244)
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:273)
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:95)
org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceAndRun(ExecuteProduceConsume.java:246)
org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:156)
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:654)
org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:572)
java.lang.Thread.run(Unknown Source)

It seems you have a report with a script data source that is burning a lot of CPU while running.

This script would be something you or someone else on your project wrote, so maybe take a closer look at those.
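
For reference, a script data source in the reporting module is just a function with the signature updateData(data, sample). Below is a purely hypothetical sketch of the kind of per-row work that can pin a webserver thread the way your dump shows; the query, table, and dataset names are invented, not taken from your PersTabelMeldingen code:

def updateData(data, sample):
    # Hypothetical report script data source -- names and query are
    # illustrative only, not from the actual project.
    headers = ["Melding", "DigitCount"]
    rows = []
    results = system.db.runQuery("SELECT melding FROM meldingen")  # assumed table/column
    for row in results:
        # Heavy per-row work (repeated int() conversions, nested queries,
        # string building) is what appears in the thread dump as a
        # webserver thread stuck inside Py code.
        count = 0
        for ch in str(row[0]):
            if ch.isdigit():
                count += int(ch)
        rows.append([row[0], count])
    data["meldingen"] = system.dataset.toDataSet(headers, rows)

Searching your report data sources and project scripts for the names in the trace (updateData is the data source function; PersTabelMeldingen appears to be a function it calls) should help narrow down which report it is.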

Hi Kevin,

Thank you for the response.

I am thinking the same thing, but I do not recall working on any script. Neither does my colleague.
Are the scripts “numbered” in any way in Ignition so I can start looking for something specific? Let’s just say there are a lot of scripts running…

KR,
Steven

It’s not just any script; it’s one that is part of a report: Scripting Data Source - Ignition User Manual 8.1 - Ignition Documentation

Not sure there’s any more detail that can be extracted from your thread dump.

Hi Kevin,

I found the issue: the tag connection to a transaction group was wrong. Actually, it was bound twice to the same tag.

Nonetheless, I am updating to Ignition 7.9.16 tomorrow. I have noticed small bugs, like not being able to cancel a report while it is running, and so on. Maybe these smaller issues in the gateway are resolved there.

Thanks everybody for the support!
